daje/tokenized_kowiki
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 1656585668
    num_examples: 1706411
  download_size: 682692770
  dataset_size: 1656585668
---

# Dataset Card for "tokenized_kowiki"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
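The split statistics in the header above can be sanity-checked directly. A minimal sketch (the arithmetic uses only the numbers declared in the metadata; the commented-out `load_dataset` call is the usual way to pull the split, but it needs the `datasets` library and network access):

```python
# Numbers copied verbatim from the card's YAML metadata.
num_bytes = 1_656_585_668
num_examples = 1_706_411

# Average size of one example in the train split (~971 bytes of text).
avg_bytes_per_example = num_bytes / num_examples
print(f"average example size: {avg_bytes_per_example:.1f} bytes")

# Loading the split itself (requires `datasets` and network access):
# from datasets import load_dataset
# ds = load_dataset("daje/tokenized_kowiki", split="train")
# assert ds.num_rows == num_examples
```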
Resizable/ECKOSOLDIER
---
license: openrail
---
Nerfgun3/sam_yang
---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---

# Sam Yang Artist Embedding / Textual Inversion

## Usage

To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.

To use it in a prompt: ```"drawn by sam_yang"```

If it is too strong, just add [] around it.

Trained for 5000 steps.

Have fun :)

## Example Pictures

<table>
  <tr>
    <td><img src=https://i.imgur.com/cbtBjwH.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/r5s8bSO.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/NpGj5KU.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/eWJlaf5.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/DOJvxTJ.png width=100% height=100%/></td>
  </tr>
</table>

## License

This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies:

1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content.
2. The authors claim no rights over the outputs you generate; you are free to use them, and you are accountable for their use, which must not go against the provisions set in the license.
3. You may redistribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully).

[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
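The prompt tips above (adding the trigger phrase, and wrapping it in [] to weaken it) can be illustrated with a tiny helper. This is an illustration only: `with_embedding` is our name, not part of any webui API, and the [] syntax is the webui's attention-reduction convention mentioned in the card:

```python
def with_embedding(prompt: str, token: str = "sam_yang", weaken: bool = False) -> str:
    """Append the embedding's trigger phrase to a prompt.

    weaken=True wraps the trigger in [] so the webui gives it less weight,
    per the card's tip for when the effect is too strong.
    """
    trigger = f"drawn by {token}"
    if weaken:
        trigger = f"[{trigger}]"
    return f"{prompt}, {trigger}"

print(with_embedding("portrait of a woman"))
# -> portrait of a woman, drawn by sam_yang
print(with_embedding("portrait of a woman", weaken=True))
# -> portrait of a woman, [drawn by sam_yang]
```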
ceefax/drms_1
---
dataset_info:
  features:
  - name: input_ids
    dtype: int32
  - name: attention_mask
    dtype: float32
  - name: token_type_ids
    dtype: float32
  splits:
  - name: train
    num_bytes: 1909524
    num_examples: 159127
  download_size: 509062
  dataset_size: 1909524
---

# Dataset Card for "drms_1"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
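Note that the features are scalar dtypes, not sequences: each row holds one int32 and two float32 values. If `num_bytes` counts these fixed-width columns at their raw widths (an assumption about the accounting, not something the card states), every example is exactly 12 bytes, and the declared totals check out:

```python
# Consistency check on the card's metadata: one int32 + two float32
# scalars per row would make each example exactly 4 + 4 + 4 = 12 bytes.
bytes_per_row = 4 + 4 + 4  # input_ids (int32) + attention_mask + token_type_ids (float32)
num_examples = 159_127
num_bytes = 1_909_524

assert num_examples * bytes_per_row == num_bytes
print("metadata is consistent: 12 bytes per example")
```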
vivekdugale/llama2_chat_mental_health_convo_amod_3.51k
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 4713396
    num_examples: 3512
  download_size: 2567536
  dataset_size: 4713396
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
Falah/village4kids_1_prompts
---
dataset_info:
  features:
  - name: prompts
    dtype: string
  splits:
  - name: train
    num_bytes: 2723
    num_examples: 11
  download_size: 2840
  dataset_size: 2723
---

# Dataset Card for "village4kids_1_prompts"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ruliad/factual-expert-processed
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 410291
    num_examples: 10
  download_size: 242093
  dataset_size: 410291
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
RealTimeData/code_alltime
--- dataset_info: - config_name: 2017-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 27818809 num_examples: 498 download_size: 8186744 dataset_size: 27818809 - config_name: 2017-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9062731 num_examples: 583 download_size: 2750217 dataset_size: 9062731 - config_name: 2017-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 12635897 num_examples: 432 download_size: 3699162 dataset_size: 12635897 - config_name: 2017-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9801445 num_examples: 515 download_size: 2932006 dataset_size: 9801445 - config_name: 2017-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5356773 num_examples: 486 download_size: 1857771 dataset_size: 5356773 - config_name: 2017-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5851432 num_examples: 449 download_size: 1632423 dataset_size: 5851432 - config_name: 
2017-07 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9499347 num_examples: 471 download_size: 2807477 dataset_size: 9499347 - config_name: 2017-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8017756 num_examples: 512 download_size: 2169638 dataset_size: 8017756 - config_name: 2017-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7307297 num_examples: 439 download_size: 2659164 dataset_size: 7307297 - config_name: 2017-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 28920173 num_examples: 596 download_size: 8740153 dataset_size: 28920173 - config_name: 2017-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9053832 num_examples: 450 download_size: 3111540 dataset_size: 9053832 - config_name: 2017-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 10601028 num_examples: 566 download_size: 3112609 dataset_size: 10601028 - config_name: 2018-01 features: - name: 
file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 26555392 num_examples: 436 download_size: 7651268 dataset_size: 26555392 - config_name: 2018-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8199606 num_examples: 546 download_size: 2791025 dataset_size: 8199606 - config_name: 2018-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 31958551 num_examples: 473 download_size: 7160895 dataset_size: 31958551 - config_name: 2018-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 27846854 num_examples: 431 download_size: 8187476 dataset_size: 27846854 - config_name: 2018-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5913046 num_examples: 485 download_size: 1997067 dataset_size: 5913046 - config_name: 2018-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 4944199 num_examples: 413 download_size: 1554876 dataset_size: 4944199 - config_name: 2018-07 features: - name: file_path dtype: string - name: 
num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7423297 num_examples: 500 download_size: 2460992 dataset_size: 7423297 - config_name: 2018-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5012280 num_examples: 471 download_size: 1565885 dataset_size: 5012280 - config_name: 2018-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 11567458 num_examples: 534 download_size: 3631200 dataset_size: 11567458 - config_name: 2018-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 3960621 num_examples: 469 download_size: 1189681 dataset_size: 3960621 - config_name: 2018-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5787805 num_examples: 456 download_size: 1658984 dataset_size: 5787805 - config_name: 2018-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 80257123 num_examples: 564 download_size: 15409397 dataset_size: 80257123 - config_name: 2019-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - 
name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 10884294 num_examples: 559 download_size: 3443301 dataset_size: 10884294 - config_name: 2019-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5649301 num_examples: 440 download_size: 1797649 dataset_size: 5649301 - config_name: 2019-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 18363947 num_examples: 587 download_size: 5020308 dataset_size: 18363947 - config_name: 2019-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 6694701 num_examples: 551 download_size: 2107122 dataset_size: 6694701 - config_name: 2019-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 26803692 num_examples: 561 download_size: 6332158 dataset_size: 26803692 - config_name: 2019-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7680262 num_examples: 626 download_size: 2388316 dataset_size: 7680262 - config_name: 2019-07 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - 
name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 48256751 num_examples: 782 download_size: 10976769 dataset_size: 48256751 - config_name: 2019-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 44531740 num_examples: 621 download_size: 9532153 dataset_size: 44531740 - config_name: 2019-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 13659784 num_examples: 633 download_size: 4329485 dataset_size: 13659784 - config_name: 2019-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 14255546 num_examples: 641 download_size: 4620728 dataset_size: 14255546 - config_name: 2019-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 4447318 num_examples: 481 download_size: 1369815 dataset_size: 4447318 - config_name: 2019-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8185509 num_examples: 674 download_size: 2414564 dataset_size: 8185509 - config_name: 2020-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: 
string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8455425 num_examples: 550 download_size: 2711023 dataset_size: 8455425 - config_name: 2020-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 13324957 num_examples: 647 download_size: 4004628 dataset_size: 13324957 - config_name: 2020-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8637049 num_examples: 641 download_size: 2618621 dataset_size: 8637049 - config_name: 2020-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 21680797 num_examples: 523 download_size: 6186771 dataset_size: 21680797 - config_name: 2020-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 33247689 num_examples: 745 download_size: 9491599 dataset_size: 33247689 - config_name: 2020-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 32091028 num_examples: 650 download_size: 9477554 dataset_size: 32091028 - config_name: 2020-07 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date 
dtype: string - name: sha dtype: string splits: - name: train num_bytes: 11260724 num_examples: 648 download_size: 3516116 dataset_size: 11260724 - config_name: 2020-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 10871233 num_examples: 627 download_size: 3431593 dataset_size: 10871233 - config_name: 2020-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 31711385 num_examples: 521 download_size: 8342950 dataset_size: 31711385 - config_name: 2020-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 12508833 num_examples: 613 download_size: 3741252 dataset_size: 12508833 - config_name: 2020-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 11227959 num_examples: 677 download_size: 3230957 dataset_size: 11227959 - config_name: 2020-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 11118488 num_examples: 640 download_size: 3401502 dataset_size: 11118488 - config_name: 2021-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha 
dtype: string splits: - name: train num_bytes: 17085054 num_examples: 621 download_size: 5321474 dataset_size: 17085054 - config_name: 2021-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 15458575 num_examples: 578 download_size: 4787808 dataset_size: 15458575 - config_name: 2021-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7703720 num_examples: 653 download_size: 2426969 dataset_size: 7703720 - config_name: 2021-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 15037278 num_examples: 678 download_size: 4548380 dataset_size: 15037278 - config_name: 2021-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 15187401 num_examples: 591 download_size: 5398432 dataset_size: 15187401 - config_name: 2021-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 13920465 num_examples: 706 download_size: 4550436 dataset_size: 13920465 - config_name: 2021-07 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - 
name: train num_bytes: 7601824 num_examples: 543 download_size: 2361359 dataset_size: 7601824 - config_name: 2021-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7945161 num_examples: 502 download_size: 2397710 dataset_size: 7945161 - config_name: 2021-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7577437 num_examples: 551 download_size: 2325651 dataset_size: 7577437 - config_name: 2021-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 12530205 num_examples: 634 download_size: 3259435 dataset_size: 12530205 - config_name: 2021-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 14472788 num_examples: 547 download_size: 4711471 dataset_size: 14472788 - config_name: 2021-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5200434 num_examples: 467 download_size: 1527070 dataset_size: 5200434 - config_name: 2022-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 19748357 
num_examples: 670 download_size: 6406111 dataset_size: 19748357 - config_name: 2022-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 10564005 num_examples: 530 download_size: 2942060 dataset_size: 10564005 - config_name: 2022-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8330402 num_examples: 555 download_size: 2711949 dataset_size: 8330402 - config_name: 2022-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 15195730 num_examples: 505 download_size: 4886429 dataset_size: 15195730 - config_name: 2022-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 15480499 num_examples: 608 download_size: 4705460 dataset_size: 15480499 - config_name: 2022-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5398707 num_examples: 497 download_size: 1648305 dataset_size: 5398707 - config_name: 2022-07 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 50537703 num_examples: 435 
download_size: 8108640 dataset_size: 50537703 - config_name: 2022-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 11369482 num_examples: 501 download_size: 3233652 dataset_size: 11369482 - config_name: 2022-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 8362040 num_examples: 590 download_size: 2797011 dataset_size: 8362040 - config_name: 2022-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 68650727 num_examples: 658 download_size: 11446155 dataset_size: 68650727 - config_name: 2022-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 12827870 num_examples: 554 download_size: 3769127 dataset_size: 12827870 - config_name: 2022-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 5062252 num_examples: 405 download_size: 1542956 dataset_size: 5062252 - config_name: 2023-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7424247 num_examples: 524 download_size: 2280205 
dataset_size: 7424247 - config_name: 2023-02 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 14611475 num_examples: 651 download_size: 4553715 dataset_size: 14611475 - config_name: 2023-03 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9090842 num_examples: 554 download_size: 3053667 dataset_size: 9090842 - config_name: 2023-04 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9109711 num_examples: 655 download_size: 2983998 dataset_size: 9109711 - config_name: 2023-05 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 12026115 num_examples: 700 download_size: 3705822 dataset_size: 12026115 - config_name: 2023-06 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9236299 num_examples: 610 download_size: 3095700 dataset_size: 9236299 - config_name: 2023-07 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 19154420 num_examples: 564 download_size: 6664885 dataset_size: 19154420 - 
config_name: 2023-08 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 14808989 num_examples: 660 download_size: 4907177 dataset_size: 14808989 - config_name: 2023-09 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9725777 num_examples: 685 download_size: 3242584 dataset_size: 9725777 - config_name: 2023-10 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 7385805 num_examples: 530 download_size: 2558675 dataset_size: 7385805 - config_name: 2023-11 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 6461718 num_examples: 491 download_size: 1851460 dataset_size: 6461718 - config_name: 2023-12 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 11708230 num_examples: 532 download_size: 3078359 dataset_size: 11708230 - config_name: 2024-01 features: - name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 6350493 num_examples: 529 download_size: 1876549 dataset_size: 6350493 - config_name: 2024-02 features: - 
name: file_path dtype: string - name: num_changed_lines dtype: int64 - name: code dtype: string - name: repo_name dtype: string - name: commit_date dtype: string - name: sha dtype: string splits: - name: train num_bytes: 9488240 num_examples: 571 download_size: 3024420 dataset_size: 9488240 - config_name: 2024-03 features: [] splits: - name: train num_bytes: 0 num_examples: 0 download_size: 324 dataset_size: 0 configs: - config_name: 2017-01 data_files: - split: train path: 2017-01/train-* - config_name: 2017-02 data_files: - split: train path: 2017-02/train-* - config_name: 2017-03 data_files: - split: train path: 2017-03/train-* - config_name: 2017-04 data_files: - split: train path: 2017-04/train-* - config_name: 2017-05 data_files: - split: train path: 2017-05/train-* - config_name: 2017-06 data_files: - split: train path: 2017-06/train-* - config_name: 2017-07 data_files: - split: train path: 2017-07/train-* - config_name: 2017-08 data_files: - split: train path: 2017-08/train-* - config_name: 2017-09 data_files: - split: train path: 2017-09/train-* - config_name: 2017-10 data_files: - split: train path: 2017-10/train-* - config_name: 2017-11 data_files: - split: train path: 2017-11/train-* - config_name: 2017-12 data_files: - split: train path: 2017-12/train-* - config_name: 2018-01 data_files: - split: train path: 2018-01/train-* - config_name: 2018-02 data_files: - split: train path: 2018-02/train-* - config_name: 2018-03 data_files: - split: train path: 2018-03/train-* - config_name: 2018-04 data_files: - split: train path: 2018-04/train-* - config_name: 2018-05 data_files: - split: train path: 2018-05/train-* - config_name: 2018-06 data_files: - split: train path: 2018-06/train-* - config_name: 2018-07 data_files: - split: train path: 2018-07/train-* - config_name: 2018-08 data_files: - split: train path: 2018-08/train-* - config_name: 2018-09 data_files: - split: train path: 2018-09/train-* - config_name: 2018-10 data_files: - split: train path: 
2018-10/train-* - config_name: 2018-11 data_files: - split: train path: 2018-11/train-* - config_name: 2018-12 data_files: - split: train path: 2018-12/train-* - config_name: 2019-01 data_files: - split: train path: 2019-01/train-* - config_name: 2019-02 data_files: - split: train path: 2019-02/train-* - config_name: 2019-03 data_files: - split: train path: 2019-03/train-* - config_name: 2019-04 data_files: - split: train path: 2019-04/train-* - config_name: 2019-05 data_files: - split: train path: 2019-05/train-* - config_name: 2019-06 data_files: - split: train path: 2019-06/train-* - config_name: 2019-07 data_files: - split: train path: 2019-07/train-* - config_name: 2019-08 data_files: - split: train path: 2019-08/train-* - config_name: 2019-09 data_files: - split: train path: 2019-09/train-* - config_name: 2019-10 data_files: - split: train path: 2019-10/train-* - config_name: 2019-11 data_files: - split: train path: 2019-11/train-* - config_name: 2019-12 data_files: - split: train path: 2019-12/train-* - config_name: 2020-01 data_files: - split: train path: 2020-01/train-* - config_name: 2020-02 data_files: - split: train path: 2020-02/train-* - config_name: 2020-03 data_files: - split: train path: 2020-03/train-* - config_name: 2020-04 data_files: - split: train path: 2020-04/train-* - config_name: 2020-05 data_files: - split: train path: 2020-05/train-* - config_name: 2020-06 data_files: - split: train path: 2020-06/train-* - config_name: 2020-07 data_files: - split: train path: 2020-07/train-* - config_name: 2020-08 data_files: - split: train path: 2020-08/train-* - config_name: 2020-09 data_files: - split: train path: 2020-09/train-* - config_name: 2020-10 data_files: - split: train path: 2020-10/train-* - config_name: 2020-11 data_files: - split: train path: 2020-11/train-* - config_name: 2020-12 data_files: - split: train path: 2020-12/train-* - config_name: 2021-01 data_files: - split: train path: 2021-01/train-* - config_name: 2021-02 data_files: - 
split: train path: 2021-02/train-* - config_name: 2021-03 data_files: - split: train path: 2021-03/train-* - config_name: 2021-04 data_files: - split: train path: 2021-04/train-* - config_name: 2021-05 data_files: - split: train path: 2021-05/train-* - config_name: 2021-06 data_files: - split: train path: 2021-06/train-* - config_name: 2021-07 data_files: - split: train path: 2021-07/train-* - config_name: 2021-08 data_files: - split: train path: 2021-08/train-* - config_name: 2021-09 data_files: - split: train path: 2021-09/train-* - config_name: 2021-10 data_files: - split: train path: 2021-10/train-* - config_name: 2021-11 data_files: - split: train path: 2021-11/train-* - config_name: 2021-12 data_files: - split: train path: 2021-12/train-* - config_name: 2022-01 data_files: - split: train path: 2022-01/train-* - config_name: 2022-02 data_files: - split: train path: 2022-02/train-* - config_name: 2022-03 data_files: - split: train path: 2022-03/train-* - config_name: 2022-04 data_files: - split: train path: 2022-04/train-* - config_name: 2022-05 data_files: - split: train path: 2022-05/train-* - config_name: 2022-06 data_files: - split: train path: 2022-06/train-* - config_name: 2022-07 data_files: - split: train path: 2022-07/train-* - config_name: 2022-08 data_files: - split: train path: 2022-08/train-* - config_name: 2022-09 data_files: - split: train path: 2022-09/train-* - config_name: 2022-10 data_files: - split: train path: 2022-10/train-* - config_name: 2022-11 data_files: - split: train path: 2022-11/train-* - config_name: 2022-12 data_files: - split: train path: 2022-12/train-* - config_name: 2023-01 data_files: - split: train path: 2023-01/train-* - config_name: 2023-02 data_files: - split: train path: 2023-02/train-* - config_name: 2023-03 data_files: - split: train path: 2023-03/train-* - config_name: 2023-04 data_files: - split: train path: 2023-04/train-* - config_name: 2023-05 data_files: - split: train path: 2023-05/train-* - config_name: 
2023-06 data_files: - split: train path: 2023-06/train-* - config_name: 2023-07 data_files: - split: train path: 2023-07/train-* - config_name: 2023-08 data_files: - split: train path: 2023-08/train-* - config_name: 2023-09 data_files: - split: train path: 2023-09/train-* - config_name: 2023-10 data_files: - split: train path: 2023-10/train-* - config_name: 2023-11 data_files: - split: train path: 2023-11/train-* - config_name: 2023-12 data_files: - split: train path: 2023-12/train-* - config_name: 2024-01 data_files: - split: train path: 2024-01/train-* - config_name: 2024-02 data_files: - split: train path: 2024-02/train-* - config_name: 2024-03 data_files: - split: train path: 2024-03/train-* ---
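The configurations above are keyed by month, running from 2017-01 through 2024-03. As a hedged sketch (an illustration, not part of the original card), the full list of config names can be generated rather than typed out, e.g. to iterate over every monthly split:

```python
# Hypothetical helper (an assumption, not from the card): enumerate the
# monthly config names listed above, 2017-01 through 2024-03 inclusive.
monthly_configs = [f"{year}-{month:02d}"
                   for year in range(2017, 2025)
                   for month in range(1, 13)]
# Truncate the final year after its last listed month, 2024-03.
monthly_configs = monthly_configs[: monthly_configs.index("2024-03") + 1]

print(len(monthly_configs))  # 87 months in total
```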
pytorch-survival/metabric_pycox
--- dataset_info: features: - name: x0 dtype: float32 - name: x1 dtype: float32 - name: x2 dtype: float32 - name: x3 dtype: float32 - name: x4 dtype: float32 - name: x5 dtype: float32 - name: x6 dtype: float32 - name: x7 dtype: float32 - name: x8 dtype: float32 - name: event_time dtype: float32 - name: event_indicator dtype: int32 splits: - name: train num_bytes: 83776 num_examples: 1904 download_size: 68030 dataset_size: 83776 --- # Dataset Card for "metabric_pycox" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
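The schema above pairs nine covariates (`x0`–`x8`) with a survival label: `event_time` and `event_indicator`. As an illustrative sketch (not part of the original card), and assuming the usual convention that `event_indicator == 1` marks an observed event and `0` a censored one, a tiny pure-Python Kaplan-Meier estimate shows how these two columns are typically consumed:

```python
# Hedged sketch: Kaplan-Meier survival estimate over (event_time,
# event_indicator) pairs shaped like the dataset's last two columns.
# Assumes event_indicator == 1 means an observed event, 0 means censored.
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct time where an event occurred."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)          # subjects still under observation
    survival, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = leaving = 0
        # Group all subjects sharing this time (events and censorings).
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            leaving += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= leaving
    return curve


toy_curve = kaplan_meier([5.0, 8.0, 8.0, 12.0], [1, 1, 0, 1])
```

With these toy inputs the estimate drops to 0.75 after the first event, to 0.5 after the tied time 8.0 (one event, one censoring sharing the risk set), and to 0.0 at 12.0.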
open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp
--- pretty_name: Evaluation run of superlazycoder/NeuralPipe-7B-slerp dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [superlazycoder/NeuralPipe-7B-slerp](https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-13T16:47:37.959217](https://huggingface.co/datasets/open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp/blob/main/results_2024-01-13T16-47-37.959217.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445269708058093,\n\ \ \"acc_stderr\": 0.03218714474134609,\n \"acc_norm\": 0.6449418405596148,\n\ \ \"acc_norm_stderr\": 0.03284511879516387,\n \"mc1\": 0.4283965728274174,\n\ \ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n\ \ \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n\ \ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518829\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6701852220673172,\n\ \ \"acc_stderr\": 0.0046918486653990685,\n \"acc_norm\": 0.8616809400517825,\n\ \ \"acc_norm_stderr\": 0.003445289925011734\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\ \ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\ \ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\ \ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \ \ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\ \ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\ : {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n\ \ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n\ \ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\ \ \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n\ \ \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\ \ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\ \ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \ \ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\ : 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\ acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\ acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\ acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\ acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\ acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\ \ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\ \ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\ acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\ acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\ acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8099173553719008,\n \"acc_stderr\": 
0.03581796951709282,\n \"\ acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\ \ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\ \ \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n\ \ \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\ \ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\ \ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\ \ \"acc_norm_stderr\": 0.016083749986853697\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\ \ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\ \ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \ \ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\ \ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\ \ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\ \ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\ \ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ 
\"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\ \ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\ \ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\ \ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n\ \ \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \ \ \"acc_stderr\": 0.012824066621488845\n }\n}\n```" repo_url: https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|arc:challenge|25_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-13T16-47-37.959217.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|gsm8k|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hellaswag|10_2024-01-13T16-47-37.959217.parquet' - split: 
latest path: - '**/details_harness|hellaswag|10_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T16-47-37.959217.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|truthfulqa:mc|0_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-13T16-47-37.959217.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_13T16_47_37.959217 path: - '**/details_harness|winogrande|5_2024-01-13T16-47-37.959217.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-13T16-47-37.959217.parquet' - config_name: results data_files: - split: 
2024_01_13T16_47_37.959217 path: - results_2024-01-13T16-47-37.959217.parquet - split: latest path: - results_2024-01-13T16-47-37.959217.parquet
---

# Dataset Card for Evaluation run of superlazycoder/NeuralPipe-7B-slerp

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [superlazycoder/NeuralPipe-7B-slerp](https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-13T16:47:37.959217](https://huggingface.co/datasets/open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp/blob/main/results_2024-01-13T16-47-37.959217.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6445269708058093, "acc_stderr": 0.03218714474134609, "acc_norm": 0.6449418405596148, "acc_norm_stderr": 0.03284511879516387, "mc1": 0.4283965728274174, "mc1_stderr": 0.017323088597314754, "mc2": 0.598408044881861, "mc2_stderr": 0.015149948573522944 }, "harness|arc:challenge|25": { "acc": 0.6476109215017065, "acc_stderr": 0.013960142600598675, "acc_norm": 0.6757679180887372, "acc_norm_stderr": 0.013678810399518829 }, "harness|hellaswag|10": { "acc": 0.6701852220673172, "acc_stderr": 0.0046918486653990685, "acc_norm": 0.8616809400517825, "acc_norm_stderr": 0.003445289925011734 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, 
"acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586818, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586818 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603346, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603346 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.03006676158297793, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.03006676158297793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8550458715596331, "acc_stderr": 0.01509421569970048, "acc_norm": 0.8550458715596331, "acc_norm_stderr": 0.01509421569970048 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 
}, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474082, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.0257449025322909, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.0257449025322909 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8352490421455939, "acc_stderr": 0.013265346261323793, "acc_norm": 0.8352490421455939, "acc_norm_stderr": 0.013265346261323793 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36312849162011174, "acc_stderr": 0.016083749986853697, "acc_norm": 0.36312849162011174, "acc_norm_stderr": 0.016083749986853697 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7450980392156863, "acc_stderr": 0.02495418432487991, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.02495418432487991 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4726205997392438, "acc_stderr": 0.012751075788015058, "acc_norm": 0.4726205997392438, "acc_norm_stderr": 0.012751075788015058 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 
0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.4283965728274174, "mc1_stderr": 0.017323088597314754, "mc2": 0.598408044881861, "mc2_stderr": 0.015149948573522944 }, "harness|winogrande|5": { "acc": 0.8018942383583267, "acc_stderr": 0.01120186274448705 }, "harness|gsm8k|5": { "acc": 0.6823351023502654, "acc_stderr": 0.012824066621488845 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
ruanchaves/hatebr_por_Latn_to_glg_Latn
--- dataset_info: features: - name: instagram_comments dtype: string - name: offensive_language dtype: bool - name: offensiveness_levels dtype: int32 - name: antisemitism dtype: bool - name: apology_for_the_dictatorship dtype: bool - name: fatphobia dtype: bool - name: homophobia dtype: bool - name: partyism dtype: bool - name: racism dtype: bool - name: religious_intolerance dtype: bool - name: sexism dtype: bool - name: xenophobia dtype: bool - name: offensive_&_non-hate_speech dtype: bool - name: non-offensive dtype: bool - name: specialist_1_hate_speech dtype: bool - name: specialist_2_hate_speech dtype: bool - name: specialist_3_hate_speech dtype: bool splits: - name: train num_bytes: 366154 num_examples: 4480 - name: validation num_bytes: 82771 num_examples: 1120 - name: test num_bytes: 98956 num_examples: 1400 download_size: 0 dataset_size: 547881 --- # Dataset Card for "hatebr_por_Latn_to_glg_Latn" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
gblazex/models-text-generation-popular-PRIVATE
--- license: mit ---
BeIR/fiqa-generated-queries
--- annotations_creators: [] language_creators: [] language: - en license: - cc-by-sa-4.0 multilinguality: - monolingual paperswithcode_id: beir pretty_name: BEIR Benchmark size_categories: msmarco: - 1M<n<10M trec-covid: - 100k<n<1M nfcorpus: - 1K<n<10K nq: - 1M<n<10M hotpotqa: - 1M<n<10M fiqa: - 10K<n<100K arguana: - 1K<n<10K touche-2020: - 100K<n<1M cqadupstack: - 100K<n<1M quora: - 100K<n<1M dbpedia: - 1M<n<10M scidocs: - 10K<n<100K fever: - 1M<n<10M climate-fever: - 1M<n<10M scifact: - 1K<n<10K source_datasets: [] task_categories: - text-retrieval - zero-shot-retrieval - information-retrieval - zero-shot-information-retrieval task_ids: - passage-retrieval - entity-linking-retrieval - fact-checking-retrieval - tweet-retrieval - citation-prediction-retrieval - duplication-question-retrieval - argument-retrieval - news-retrieval - biomedical-information-retrieval - question-answering-retrieval --- # Dataset Card for BEIR Benchmark ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - 
**Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca

### Dataset Summary

BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:

- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)

All these datasets have been preprocessed and can be used for your experiments.

```python
# A minimal loading sketch for this repository; the exact split and
# configuration names depend on this dataset's layout on the Hub.
from datasets import load_dataset

queries = load_dataset("BeIR/fiqa-generated-queries")
```

### Supported Tasks and Leaderboards

The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).

### Languages

All tasks are in English (`en`).

## Dataset Structure

All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:

- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields: `_id` with a unique document identifier, `title` with the document title (optional) and `text` with a document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields: `_id` with a unique query identifier and `text` with the query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`

### Data Instances

A high-level example of a BEIR dataset:

```python
corpus = {
    "doc1" : {
        "title": "Albert Einstein",
        "text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
}, "doc2" : { "title": "", # Keep title an empty string if not present "text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \ malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\ with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)." }, } queries = { "q1" : "Who developed the mass-energy equivalence formula?", "q2" : "Which beer is brewed with a large proportion of wheat?" } qrels = { "q1" : {"doc1": 1}, "q2" : {"doc2": 1}, } ``` ### Data Fields Examples from all configurations have the following features: ### Corpus - `corpus`: a `dict` feature representing the document title and passage text, made up of: - `_id`: a `string` feature representing the unique document id - `title`: a `string` feature, denoting the title of the document. - `text`: a `string` feature, denoting the text of the document. ### Queries - `queries`: a `dict` feature representing the query, made up of: - `_id`: a `string` feature representing the unique query id - `text`: a `string` feature, denoting the text of the query. ### Qrels - `qrels`: a `dict` feature representing the query document relevance judgements, made up of: - `_id`: a `string` feature representing the query id - `_id`: a `string` feature, denoting the document id. - `score`: a `int32` feature, denoting the relevance judgement between query and document. 
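The corpus, queries and qrels files described above can be parsed with the standard library alone. The sketch below is illustrative: the file names (`corpus.jsonl`, `queries.jsonl`, and a qrels `.tsv` with a header row) follow the conventional BEIR layout and may differ for a given download.

```python
import csv
import json

def load_corpus(path):
    # corpus.jsonl: one JSON object per line with "_id", optional "title", and "text"
    corpus = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            doc = json.loads(line)
            corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}
    return corpus

def load_queries(path):
    # queries.jsonl: one JSON object per line with "_id" and "text"
    with open(path, encoding="utf-8") as f:
        return {q["_id"]: q["text"] for q in map(json.loads, f)}

def load_qrels(path):
    # qrels .tsv: header row, then query-id <tab> corpus-id <tab> score
    qrels = {}
    with open(path, encoding="utf-8") as f:
        reader = csv.reader(f, delimiter="\t")
        next(reader)  # skip the header row
        for query_id, corpus_id, score in reader:
            qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels
```

The returned dictionaries match the `corpus`, `queries` and `qrels` shapes shown in the Data Instances example above.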
### Data Splits | Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 | | -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:| | MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` | | TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` | | NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` | | BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) | | NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` | | HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` | | FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` | | Signal-1M(RT) | 
[Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) | | TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) | | ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` | | Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` | | CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` | | Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` | | DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` | | SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` | | FEVER | 
[Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` | | Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` | | SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` | | Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) | ## Dataset Creation ### Curation Rationale [Needs More Information] ### Source Data #### Initial Data Collection and Normalization [Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process [Needs More Information] #### Who are the annotators? 
[Needs More Information] ### Personal and Sensitive Information [Needs More Information] ## Considerations for Using the Data ### Social Impact of Dataset [Needs More Information] ### Discussion of Biases [Needs More Information] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators [Needs More Information] ### Licensing Information [Needs More Information] ### Citation Information Cite as: ``` @inproceedings{ thakur2021beir, title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models}, author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych}, booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)}, year={2021}, url={https://openreview.net/forum?id=wCu6T5xFjeJ} } ``` ### Contributions Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset.
clarin-knext/cen
---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- pl
license:
- cc-by-3.0
multilinguality:
- monolingual
pretty_name: 'KPWr 1.27'
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---

# CEN
aimyonnnna/customhkcode2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 5826 num_examples: 39 download_size: 2572 dataset_size: 5826 configs: - config_name: default data_files: - split: train path: data/train-* ---
ssbuild/alpaca_finance_en
--- license: apache-2.0 ---
open-llm-leaderboard/details_alnrg2arg__blockchainlabs_test3_seminar
--- pretty_name: Evaluation run of alnrg2arg/blockchainlabs_test3_seminar dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [alnrg2arg/blockchainlabs_test3_seminar](https://huggingface.co/alnrg2arg/blockchainlabs_test3_seminar)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__blockchainlabs_test3_seminar\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-02T04:30:24.941518](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_test3_seminar/blob/main/results_2024-02-02T04-30-24.941518.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6526679268733767,\n\ \ \"acc_stderr\": 0.03208915302774204,\n \"acc_norm\": 0.6516694604557469,\n\ \ \"acc_norm_stderr\": 0.032768893712299095,\n \"mc1\": 0.5716034271725826,\n\ \ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7247121699417279,\n\ \ \"mc2_stderr\": 0.01469874984195087\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.013340916085246256,\n\ \ \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.013094469919538809\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.711611232822147,\n\ \ \"acc_stderr\": 0.004520870679457038,\n \"acc_norm\": 0.8893646683927504,\n\ \ \"acc_norm_stderr\": 0.0031303894668331987\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\ \ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\ \ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\ \ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\ \ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\ \ \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\ \ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\ acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\ \ 
},\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\ \ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\ \ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\ \ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\ \ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\ \ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\ : 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\ : 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\ \ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\ \ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \ \ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\ acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\ acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \ \ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 
0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\ \ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\ \ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\ \ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\ \ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\ \ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\ \ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\ \ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n\ \ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\ \ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\ \ \"acc_norm_stderr\": 0.016519594275297117\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n\ \ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\ \ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\ \ \"acc_stderr\": 0.01274920600765747,\n \"acc_norm\": 0.47131681877444587,\n\ \ \"acc_norm_stderr\": 0.01274920600765747\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \ \ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ 
\"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\ \ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7247121699417279,\n\ \ \"mc2_stderr\": 0.01469874984195087\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.856353591160221,\n \"acc_stderr\": 0.009857280052696737\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \ \ \"acc_stderr\": 0.012579398235589534\n }\n}\n```" repo_url: https://huggingface.co/alnrg2arg/blockchainlabs_test3_seminar leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|arc:challenge|25_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-02T04-30-24.941518.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|gsm8k|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hellaswag|10_2024-02-02T04-30-24.941518.parquet' - 
split: latest path: - '**/details_harness|hellaswag|10_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-30-24.941518.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-30-24.941518.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-30-24.941518.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-30-24.941518.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-30-24.941518.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-30-24.941518.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-30-24.941518.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-30-24.941518.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T04-30-24.941518.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_02T04_30_24.941518 path: - '**/details_harness|winogrande|5_2024-02-02T04-30-24.941518.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-02T04-30-24.941518.parquet' - config_name: results data_files: - split: 
2024_02_02T04_30_24.941518 path: - results_2024-02-02T04-30-24.941518.parquet - split: latest path: - results_2024-02-02T04-30-24.941518.parquet --- # Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_test3_seminar <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_test3_seminar](https://huggingface.co/alnrg2arg/blockchainlabs_test3_seminar) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_test3_seminar", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T04:30:24.941518](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_test3_seminar/blob/main/results_2024-02-02T04-30-24.941518.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {"acc": 0.6526679268733767, "acc_stderr": 0.03208915302774204, "acc_norm": 0.6516694604557469, "acc_norm_stderr": 0.032768893712299095, "mc1": 0.5716034271725826, "mc1_stderr": 0.017323088597314743, "mc2": 0.7247121699417279, "mc2_stderr": 0.01469874984195087},
    "harness|arc:challenge|25": {"acc": 0.7039249146757679, "acc_stderr": 0.013340916085246256, "acc_norm": 0.7218430034129693, "acc_norm_stderr": 0.013094469919538809},
    "harness|hellaswag|10": {"acc": 0.711611232822147, "acc_stderr": 0.004520870679457038, "acc_norm": 0.8893646683927504, "acc_norm_stderr": 0.0031303894668331987},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218},
    "harness|hendrycksTest-clinical_knowledge|5": {"acc": 0.7056603773584905, "acc_stderr": 0.028049186315695255, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.028049186315695255},
    "harness|hendrycksTest-college_biology|5": {"acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333},
    "harness|hendrycksTest-college_computer_science|5": {"acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316},
    "harness|hendrycksTest-college_medicine|5": {"acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283},
    "harness|hendrycksTest-conceptual_physics|5": {"acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468},
    "harness|hendrycksTest-econometrics|5": {"acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864},
    "harness|hendrycksTest-electrical_engineering|5": {"acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117},
    "harness|hendrycksTest-elementary_mathematics|5": {"acc": 0.43386243386243384, "acc_stderr": 0.025525034382474887, "acc_norm": 0.43386243386243384, "acc_norm_stderr": 0.025525034382474887},
    "harness|hendrycksTest-formal_logic|5": {"acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316},
    "harness|hendrycksTest-high_school_biology|5": {"acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518},
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105},
    "harness|hendrycksTest-high_school_computer_science|5": {"acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505},
    "harness|hendrycksTest-high_school_european_history|5": {"acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289},
    "harness|hendrycksTest-high_school_geography|5": {"acc": 0.8181818181818182, "acc_stderr": 0.0274796030105388, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.0274796030105388},
    "harness|hendrycksTest-high_school_government_and_politics|5": {"acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424},
    "harness|hendrycksTest-high_school_macroeconomics|5": {"acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255},
    "harness|hendrycksTest-high_school_microeconomics|5": {"acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793},
    "harness|hendrycksTest-high_school_physics|5": {"acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629},
    "harness|hendrycksTest-high_school_psychology|5": {"acc": 0.8458715596330275, "acc_stderr": 0.015480826865374307, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374307},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572},
    "harness|hendrycksTest-high_school_us_history|5": {"acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692},
    "harness|hendrycksTest-high_school_world_history|5": {"acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713},
    "harness|hendrycksTest-human_sexuality|5": {"acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714},
    "harness|hendrycksTest-international_law|5": {"acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966},
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.7407407407407407, "acc_stderr": 0.04236511258094632, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094632},
    "harness|hendrycksTest-logical_fallacies|5": {"acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127},
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191},
    "harness|hendrycksTest-management|5": {"acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128},
    "harness|hendrycksTest-miscellaneous|5": {"acc": 0.8237547892720306, "acc_stderr": 0.013625556907993466, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993466},
    "harness|hendrycksTest-moral_disputes|5": {"acc": 0.7341040462427746, "acc_stderr": 0.023786203255508297, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508297},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.4223463687150838, "acc_stderr": 0.016519594275297117, "acc_norm": 0.4223463687150838, "acc_norm_stderr": 0.016519594275297117},
    "harness|hendrycksTest-nutrition|5": {"acc": 0.7189542483660131, "acc_stderr": 0.02573885479781873, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.02573885479781873},
    "harness|hendrycksTest-philosophy|5": {"acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945},
    "harness|hendrycksTest-prehistory|5": {"acc": 0.7469135802469136, "acc_stderr": 0.024191808600712992, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712992},
    "harness|hendrycksTest-professional_accounting|5": {"acc": 0.4716312056737589, "acc_stderr": 0.02977945095730307, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.02977945095730307},
    "harness|hendrycksTest-professional_law|5": {"acc": 0.47131681877444587, "acc_stderr": 0.01274920600765747, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.01274920600765747},
    "harness|hendrycksTest-professional_medicine|5": {"acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676},
    "harness|hendrycksTest-professional_psychology|5": {"acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038},
    "harness|hendrycksTest-public_relations|5": {"acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985},
    "harness|hendrycksTest-security_studies|5": {"acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294},
    "harness|hendrycksTest-sociology|5": {"acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637},
    "harness|hendrycksTest-virology|5": {"acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866},
    "harness|truthfulqa:mc|0": {"mc1": 0.5716034271725826, "mc1_stderr": 0.017323088597314743, "mc2": 0.7247121699417279, "mc2_stderr": 0.01469874984195087},
    "harness|winogrande|5": {"acc": 0.856353591160221, "acc_stderr": 0.009857280052696737},
    "harness|gsm8k|5": {"acc": 0.7035633055344959, "acc_stderr": 0.012579398235589534}
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
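The per-task entries in the eval-results JSON earlier in this card can be post-processed directly once loaded. As a minimal, illustrative sketch (the three accuracies below are copied from that JSON; the unweighted mean shown is not necessarily how the leaderboard computes its "all" aggregate):

```python
# Illustrative aggregation over a few per-task accuracies copied from the
# results JSON earlier in this card; the leaderboard's own "all" entry is
# computed over the full task set and may be aggregated differently.
results = {
    "harness|arc:challenge|25": 0.7039249146757679,
    "harness|hellaswag|10": 0.711611232822147,
    "harness|hendrycksTest-abstract_algebra|5": 0.35,
}
mean_acc = sum(results.values()) / len(results)
print(f"{mean_acc:.4f}")  # → 0.5885
```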
Jing24/new_sort_high_all_train
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: title
    dtype: string
  - name: context
    dtype: string
  - name: question
    dtype: string
  - name: answers
    struct:
    - name: answer_start
      sequence: int64
    - name: text
      sequence: string
  splits:
  - name: train
    num_bytes: 79673257
    num_examples: 87599
  download_size: 32571429
  dataset_size: 79673257
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
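The `answers` struct in the schema above follows the SQuAD convention: each `answer_start` is a character offset into `context` at which the corresponding answer string begins. A small self-contained sketch with invented text (not a real record from this dataset):

```python
# Hypothetical record illustrating the SQuAD-style `answers` struct above;
# the text is made up, not drawn from this dataset.
record = {
    "context": "The quick brown fox jumps over the lazy dog.",
    "question": "What does the fox jump over?",
    "answers": {"answer_start": [31], "text": ["the lazy dog"]},
}

start = record["answers"]["answer_start"][0]
answer = record["answers"]["text"][0]
# The offset invariant: the answer text is recoverable by slicing the context.
assert record["context"][start:start + len(answer)] == answer
```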
BasSabretooth/amongstotherthings
---
license: other
---
nlp-with-deeplearning/Ko.HelpSteer
---
license: cc-by-nc-sa-4.0
size_categories:
- 10K<n<100K
language:
- en
- ko
---

Original dataset: [nvidia/HelpSteer](https://huggingface.co/datasets/nvidia/HelpSteer)
andersonbcdefg/gpt35_triples_filtered
---
dataset_info:
  features:
  - name: task
    dtype: string
  - name: neg
    dtype: string
  - name: query
    dtype: string
  - name: pos
    dtype: string
  - name: margin
    dtype: float32
  splits:
  - name: train
    num_bytes: 159353054.6377521
    num_examples: 151649
  download_size: 90263657
  dataset_size: 159353054.6377521
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
heliosprime/twitter_dataset_1713151666
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 4303
    num_examples: 12
  download_size: 9336
  dataset_size: 4303
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "twitter_dataset_1713151666"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
agileloop/izaz-sequence-of-actions-prediction-dataset-llama2-7b-32k
---
dataset_info:
  features:
  - name: Instruction
    dtype: string
  - name: Response
    struct:
    - name: action_type
      dtype: string
    - name: target_element
      list:
      - name: attributes
        dtype: string
      - name: tag
        dtype: string
  splits:
  - name: train
    num_bytes: 281382092
    num_examples: 10738
  download_size: 39998989
  dataset_size: 281382092
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
Nicolas-BZRD/DILA_OPENDATA_FR_2023
---
license: odc-by
configs:
- config_name: default
  data_files:
  - split: acco
    path: data/acco/*.arrow
  - split: balo
    path: data/balo/*.arrow
  - split: capp
    path: data/capp/*.arrow
  - split: cass
    path: data/cass/*.arrow
  - split: cnil
    path: data/cnil/*.arrow
  - split: constit
    path: data/constit/*.arrow
  - split: debats
    path: data/debats/*.arrow
  - split: dole
    path: data/dole/*.arrow
  - split: inca
    path: data/inca/*.arrow
  - split: jade
    path: data/jade/*.arrow
  - split: jorf
    path: data/jorf/*.arrow
  - split: kali
    path: data/kali/*.arrow
  - split: legi
    path: data/legi/*.arrow
  - split: qr
    path: data/qr/*.arrow
  - split: sarde
    path: data/sarde/*.arrow
task_categories:
- text-classification
- question-answering
- text-generation
language:
- fr
tags:
- finance
- legal
size_categories:
- 10M<n<100M
pretty_name: French Government Open Data (DILA) Dataset - 2023
---

# French Government Open Data (DILA) Dataset - 2023

## Overview

The French Government Open Data (DILA) Dataset is a collection of text data extracted from various sources provided by the French government, specifically the Direction de l'information légale et administrative (DILA). This dataset contains a wide range of legal, administrative, and legislative documents. The data has been organized into several categories for easy access and analysis.

## Dataset Splits

The dataset is organized into the following splits or categories:

- acco: Legal documents related to accounting and finance.
- balo: Documents related to the Bulletin des Annonces Légales Obligatoires (BALO), which publishes legal notices.
- capp: Administrative documents related to public policies and planning.
- cass: Documents related to the Cour de cassation (Court of Cassation), France's highest judicial court.
- cnil: Documents related to the Commission nationale de l'informatique et des libertés (CNIL), which deals with data protection and privacy.
- constit: Documents related to the French constitution and constitutional law.
- debats: Transcripts of parliamentary debates and discussions.
- dole: Documents related to employment and unemployment benefits.
- inca: Documents related to the Institut National du Cancer (INCa), which deals with cancer research and policy.
- jade: Legal documents related to jurisprudence and legal decisions.
- jorf: Documents related to the Journal Officiel de la République Française (JORF), the official journal of the French government.
- kali: Documents related to the Kali database, which contains collective agreements.
- legi: Legal documents related to French legislation.
- qr: Questions and answers related to parliamentary sessions.
- sarde: Documents related to the Service d'administration des réseaux de l'État (SARDE), which manages government networks.

## Dataset Details

Size: 25.65 GB (25 647 979 364 bytes)<br>
Languages: French<br>
Data Format: Plain text<br>
License: OPEN LICENCE<br>
Data Sources: https://echanges.dila.gouv.fr/OPENDATA/<br>
Data Collection Date: October 2023<br>
Data Structure: Id, Text<br>

- Id: A unique identifier for each document, consisting of the split name and the file name (split/file_name.txt).
- Text: The main text content of the document.

## Acknowledgments

We would like to acknowledge the French government and the Direction de l'information légale et administrative (DILA) for providing access to the data used in this dataset.

## License Information

The French Government Open Data (DILA) Dataset is made available under the terms of the "LICENCE OUVERTE / OPEN LICENCE Version 2.0."
License Name: LICENCE OUVERTE / OPEN LICENCE Version 2.0<br>
License Text: The full text of the LICENCE OUVERTE / OPEN LICENCE Version 2.0 can be found [here](https://www.etalab.gouv.fr/wp-content/uploads/2017/04/ETALAB-Licence-Ouverte-v2.0.pdf) (in French).<br>

Summary: This license allows you to copy, modify, publish, translate, distribute, or otherwise exploit the data, in any medium, mode, or format, for any lawful purpose, provided that you:

- Acknowledge the source of the data by providing appropriate attribution when using the data.
- Ensure that you do not use the data in a way that suggests any official status or endorsement by the French Government or the Direction de l'information légale et administrative (DILA).
- Comply with the terms and conditions of the license.

By using this dataset, you agree to comply with the terms and conditions specified in the LICENCE OUVERTE / OPEN LICENCE Version 2.0. For more details, please review the full text of the license provided at the link above.
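Given the Id convention described above (split name plus file name, `split/file_name.txt`), the two parts of an Id can be recovered with plain string handling. A small sketch; the example Id below is hypothetical, not a real file name from the corpus:

```python
def parse_doc_id(doc_id: str) -> tuple[str, str]:
    """Split a document Id of the form 'split/file_name.txt' into its parts."""
    split_name, _, file_name = doc_id.partition("/")
    return split_name, file_name

# Hypothetical Id, following the split/file_name.txt convention described above.
print(parse_doc_id("legi/document_0001.txt"))  # → ('legi', 'document_0001.txt')
```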
cabrooks/levenshtein_filter_50k_wordpiece
---
license: openrail
---
open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B
--- pretty_name: Evaluation run of nbeerbower/slerp-bophades-truthy-math-mistral-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [nbeerbower/slerp-bophades-truthy-math-mistral-7B](https://huggingface.co/nbeerbower/slerp-bophades-truthy-math-mistral-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-09T00:19:46.142948](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B/blob/main/results_2024-04-09T00-19-46.142948.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533577677120704,\n\ \ \"acc_stderr\": 0.0321090974841392,\n \"acc_norm\": 0.6524581392335448,\n\ \ \"acc_norm_stderr\": 0.032786891825831214,\n \"mc1\": 0.6242350061199511,\n\ \ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7782437262946236,\n\ \ \"mc2_stderr\": 0.0137879523668123\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n\ \ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7197769368651663,\n\ \ \"acc_stderr\": 0.004481902637505652,\n \"acc_norm\": 0.8916550487950607,\n\ \ \"acc_norm_stderr\": 0.0031018035745563107\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\ \ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\ \ \"acc_norm_stderr\": 0.033961162058453336\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\ \ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\ \ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"\ acc_norm\": 0.41534391534391535,\n 
\"acc_norm_stderr\": 0.0253795249107784\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\ \ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\ \ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\ : 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\ acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\ \ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \ \ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \ \ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\ acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\ acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\ acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \ \ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\ \ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\ \ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\ \ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\ \ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\ \ \"acc_stderr\": 0.01362555690799347,\n \"acc_norm\": 0.8237547892720306,\n\ \ \"acc_norm_stderr\": 0.01362555690799347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\ \ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\ \ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 
0.43798882681564244,\n\ \ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\ \ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\ \ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\ \ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\ \ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \ \ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\ \ \"acc_stderr\": 0.012758410941038913,\n \"acc_norm\": 0.4784876140808344,\n\ \ \"acc_norm_stderr\": 0.012758410941038913\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\ \ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \ \ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\ \ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n\ \ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7782437262946236,\n\ \ \"mc2_stderr\": 0.0137879523668123\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585244\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \ \ \"acc_stderr\": 0.012714401009923647\n }\n}\n```" repo_url: https://huggingface.co/nbeerbower/slerp-bophades-truthy-math-mistral-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|arc:challenge|25_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-09T00-19-46.142948.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|gsm8k|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hellaswag_10 data_files: - split: 
2024_04_09T00_19_46.142948 path: - '**/details_harness|hellaswag|10_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet' - 
'**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet' - 
'**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet' - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet' - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet' - 
'**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet' - config_name: 
harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 
2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet' - config_name: 
harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 
data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - 
split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|truthfulqa:mc|0_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-09T00-19-46.142948.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_09T00_19_46.142948 path: - '**/details_harness|winogrande|5_2024-04-09T00-19-46.142948.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-09T00-19-46.142948.parquet' - config_name: results data_files: - split: 
2024_04_09T00_19_46.142948 path: - results_2024-04-09T00-19-46.142948.parquet - split: latest path: - results_2024-04-09T00-19-46.142948.parquet --- # Dataset Card for Evaluation run of nbeerbower/slerp-bophades-truthy-math-mistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [nbeerbower/slerp-bophades-truthy-math-mistral-7B](https://huggingface.co/nbeerbower/slerp-bophades-truthy-math-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-04-09T00:19:46.142948](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B/blob/main/results_2024-04-09T00-19-46.142948.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6533577677120704, "acc_stderr": 0.0321090974841392, "acc_norm": 0.6524581392335448, "acc_norm_stderr": 0.032786891825831214, "mc1": 0.6242350061199511, "mc1_stderr": 0.01695458406021429, "mc2": 0.7782437262946236, "mc2_stderr": 0.0137879523668123 }, "harness|arc:challenge|25": { "acc": 0.7141638225255973, "acc_stderr": 0.013203196088537372, "acc_norm": 0.7286689419795221, "acc_norm_stderr": 0.012993807727545796 }, "harness|hellaswag|10": { "acc": 0.7197769368651663, "acc_stderr": 0.004481902637505652, "acc_norm": 0.8916550487950607, "acc_norm_stderr": 0.0031018035745563107 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7916666666666666, "acc_stderr": 0.033961162058453336, "acc_norm": 0.7916666666666666, "acc_norm_stderr": 0.033961162058453336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 
0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.0253795249107784, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.0253795249107784 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356853, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356853 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218967, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218967 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768763, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768763 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402534, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402534 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473082, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 
0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159463, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159463 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 
0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.01362555690799347, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.01362555690799347 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.024182427496577605, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.024182427496577605 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43798882681564244, "acc_stderr": 0.016593394227564843, "acc_norm": 0.43798882681564244, "acc_norm_stderr": 0.016593394227564843 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.025917806117147158, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.025917806117147158 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4784876140808344, "acc_stderr": 0.012758410941038913, "acc_norm": 0.4784876140808344, "acc_norm_stderr": 0.012758410941038913 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462923, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462923 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.01885008469646872, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.01885008469646872 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 
0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6242350061199511, "mc1_stderr": 0.01695458406021429, "mc2": 0.7782437262946236, "mc2_stderr": 0.0137879523668123 }, "harness|winogrande|5": { "acc": 0.8539857932123125, "acc_stderr": 0.009924440374585244 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923647 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
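The run splits described in this card follow a simple naming convention: the run timestamp (e.g. `2024-04-09T00:19:46.142948`) with `-` and `:` replaced by `_`, while the result filenames instead replace `:` with `-`. A minimal sketch of that mapping, assuming the convention holds for every run:

```python
# Sketch of the split-naming convention used by these evaluation datasets:
# a run timestamp like "2024-04-09T00:19:46.142948" becomes the split name
# "2024_04_09T00_19_46.142948", and "2024-04-09T00-19-46.142948" in filenames.

def timestamp_to_split(ts: str) -> str:
    """Turn a run timestamp into the corresponding dataset split name."""
    return ts.replace("-", "_").replace(":", "_")

def timestamp_to_filename_stem(ts: str) -> str:
    """Turn a run timestamp into the stem used in result/parquet filenames."""
    return ts.replace(":", "-")

run_ts = "2024-04-09T00:19:46.142948"
print(timestamp_to_split(run_ts))          # 2024_04_09T00_19_46.142948
print(timestamp_to_filename_stem(run_ts))  # 2024-04-09T00-19-46.142948
```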
CyberHarem/catherine_granbluefantasy
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of catherine (Granblue Fantasy) This is the dataset of catherine (Granblue Fantasy), containing 35 images and their tags. The core tags of this character are `animal_ears, long_hair, pink_hair, breasts, pink_eyes, hat, large_breasts, mini_hat, top_hat`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 35 | 35.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 35 | 24.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 67 | 45.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 35 | 34.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 67 | 59.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catherine_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/catherine_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, erune, glasses, cleavage, garter_straps, smile, tray, bare_shoulders, solo, alternate_costume, black_thighhighs, cup, holding, looking_at_viewer, ponytail, red_eyes, simple_background, skirt, teapot, green_apron, hand_on_hip, medium_breasts, very_long_hair | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, erune, solo, looking_at_viewer, smile, thighhighs, handgun, black_gloves, mini_top_hat, hairband, holding_gun, holster, blush, full_body, leotard, red_eyes, simple_background, sitting | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, erune, looking_at_viewer, simple_background, smile, solo, white_background, bangs, black_gloves, cleavage, elbow_gloves, hairband, leotard, mini_top_hat, parted_lips | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | glasses | cleavage | garter_straps | smile | tray | bare_shoulders | solo | alternate_costume | black_thighhighs | cup | holding | looking_at_viewer | ponytail | red_eyes | simple_background | 
skirt | teapot | green_apron | hand_on_hip | medium_breasts | very_long_hair | thighhighs | handgun | black_gloves | mini_top_hat | hairband | holding_gun | holster | blush | full_body | leotard | sitting | white_background | bangs | elbow_gloves | parted_lips | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:-----------|:----------------|:--------|:-------|:-----------------|:-------|:--------------------|:-------------------|:------|:----------|:--------------------|:-----------|:-----------|:--------------------|:--------|:---------|:--------------|:--------------|:-----------------|:-----------------|:-------------|:----------|:---------------|:---------------|:-----------|:--------------|:----------|:--------|:------------|:----------|:----------|:-------------------|:--------|:---------------|:--------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | X | | | X | | | | | X | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | | X | | | X | | | | | X | | | X | | | | | | | | | X | X | X | | | | | X | | X | X | X | X |
dmayhem93/agieval-lsat-lr
--- dataset_info: features: - name: query dtype: string - name: choices sequence: string - name: gold sequence: int64 splits: - name: test num_bytes: 923886 num_examples: 510 download_size: 469904 dataset_size: 923886 license: mit --- # Dataset Card for "agieval-lsat-lr" Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo. Raw dataset: https://github.com/zhongwanjun/AR-LSAT MIT License Copyright (c) 2022 Wanjun Zhong Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@misc{zhong2023agieval, title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models}, author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan}, year={2023}, eprint={2304.06364}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{zhong2021arlsat, title={AR-LSAT: Investigating Analytical Reasoning of Text}, author={Wanjun Zhong and Siyuan Wang and Duyu Tang and Zenan Xu and Daya Guo and Jiahai Wang and Jian Yin and Ming Zhou and Nan Duan}, year={2021}, eprint={2104.06598}, archivePrefix={arXiv}, primaryClass={cs.CL} } @article{wang2022lsat, title={From lsat: The progress and challenges of complex reasoning}, author={Wang, Siyuan and Liu, Zhongkun and Zhong, Wanjun and Zhou, Ming and Wei, Zhongyu and Chen, Zhumin and Duan, Nan}, journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing}, year={2022}, publisher={IEEE} }
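Given the `query`/`choices`/`gold` schema declared in the YAML header above, one record can be rendered as a lettered multiple-choice prompt. The sketch below uses an invented sample record purely to illustrate the shape, and it assumes `gold` holds answer indices into `choices`; real records come from `load_dataset("dmayhem93/agieval-lsat-lr", split="test")`.

```python
# Minimal sketch of consuming the query/choices/gold schema declared above.
# The sample record is invented for illustration only; load the real split
# with: load_dataset("dmayhem93/agieval-lsat-lr", split="test")

def format_prompt(record: dict) -> str:
    """Render one record as a lettered multiple-choice prompt."""
    letters = "ABCDE"
    lines = [record["query"]]
    lines += [f"({letters[i]}) {c}" for i, c in enumerate(record["choices"])]
    return "\n".join(lines)

sample = {
    "query": "Which option best completes the argument?",
    "choices": ["first option", "second option", "third option"],
    "gold": [1],  # assumed: list of answer indices (schema says sequence of int64)
}

print(format_prompt(sample))
print("gold:", sample["choices"][sample["gold"][0]])  # gold: second option
```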
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/83286fca
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 182 num_examples: 10 download_size: 1332 dataset_size: 182 --- # Dataset Card for "83286fca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Circularmachines/batch_indexing_machine_230529_019
--- dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 171552814.0 num_examples: 720 download_size: 171566051 dataset_size: 171552814.0 --- # Dataset Card for "batch_indexing_machine_230529_019" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
MuhammadHelmy/nafsy
--- language: - ar size_categories: - 1K<n<10K task_categories: - text-generation - text-classification tags: - mental health - psychology dataset_info: features: - name: content dtype: string - name: text_size dtype: int64 - name: topic dtype: string - name: prob dtype: float64 splits: - name: train num_bytes: 6007437.514440433 num_examples: 1884 download_size: 2896563 dataset_size: 6007437.514440433 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for nafsy <!-- Provide a quick summary of the dataset. --> This Arabic dataset is a set of mental health articles. The original dataset was scraped from [Nafsy.net](https://nafsy.net/). ## Dataset Details **Language(s) (NLP):** Arabic ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> - Unsupervised Fine-tuning - RAG ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> Dataset Fields: - content: the articles - text_size: length of article - topic: top 10 words that describe the topics of the article - prob: topic prediction accuracy ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> Creating an Arabic chatbot for mental health support. ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> - This dataset was originally scraped from [Nafsy.net](https://nafsy.net/) then uploaded to Kaggle.
- Additional preprocessing was done by this repo owner: - Cleaning data: removing URLs, extra spaces, and non-words, detaching punctuation, and dropping duplicates - Applying topic modeling to generate main topics for each article using the bert-base-arabic model - Deduplicating data using sentence-transformers (paraphrase-multilingual-MiniLM-L12-v2) #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [husamal](https://www.kaggle.com/husamal) ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** @misc{Husamal_2021, title={Arabic-physcology-dataset}, url={https://www.kaggle.com/datasets/husamal/arabicphyscologydataset?select=nafsy.csv}, journal={Kaggle}, author={Husamal}, year={2021}, month={May}} ## Dataset Card Authors Muhammad Helmy ## Dataset Card Contact muhammadhelmymmo@gmail.com
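The cleaning steps listed above (URL removal, whitespace collapsing, detaching punctuation) can be sketched as follows. The actual preprocessing code is not published, so the regexes here are illustrative assumptions, not the repo owner's implementation:

```python
import re

# Rough approximation of the cleaning pass described above. Treat these
# regexes as illustrative only; the real preprocessing code is unpublished.

URL_RE = re.compile(r"https?://\S+|www\.\S+")
PUNCT_RE = re.compile(r"([.,!?؟،؛:])")  # includes Arabic question mark, comma, semicolon

def clean(text: str) -> str:
    text = URL_RE.sub(" ", text)        # drop URLs
    text = PUNCT_RE.sub(r" \1 ", text)  # detach punctuation from words
    text = re.sub(r"\s+", " ", text)    # collapse repeated whitespace
    return text.strip()

print(clean("للمزيد زوروا https://nafsy.net/ شكرا،جزيلا"))
# the URL is removed and the Arabic comma is detached from the surrounding words
```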
Ram07/text-csv-3
--- license: mit ---
epts/kanji-full
--- license: wtfpl ---
VedCodes/my_files
--- task_categories: - text-generation language: - en tags: - medical pretty_name: pretty_file size_categories: - n<1K --- # Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). --- ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v3
--- pretty_name: Evaluation run of logicker/SkkuDS-DPO-72B-v3 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [logicker/SkkuDS-DPO-72B-v3](https://huggingface.co/logicker/SkkuDS-DPO-72B-v3)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v3\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-01T22:16:08.253878](https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v3/blob/main/results_2024-03-01T22-16-08.253878.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7681004206920752,\n\ \ \"acc_stderr\": 0.027991870123942914,\n \"acc_norm\": 0.7729329467192335,\n\ \ \"acc_norm_stderr\": 0.028511620162705854,\n \"mc1\": 0.41982864137086906,\n\ \ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5972524285520616,\n\ \ \"mc2_stderr\": 0.014498261188889689\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844465,\n\ \ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.013839039762820169\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6684923322047401,\n\ \ \"acc_stderr\": 0.004697929774670292,\n \"acc_norm\": 0.8610834495120494,\n\ \ \"acc_norm_stderr\": 0.003451525868724679\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\ \ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\ \ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n\ \ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\ \ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \ \ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846934,\n\ \ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846934\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n\ \ \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n\ \ \"acc_norm_stderr\": 
0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\ \ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n\ \ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n\ \ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\ \ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.8042553191489362,\n \"acc_stderr\": 0.025937853139977148,\n\ \ \"acc_norm\": 0.8042553191489362,\n \"acc_norm_stderr\": 0.025937853139977148\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\ \ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n\ \ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n\ \ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.7142857142857143,\n \"acc_stderr\": 0.023266512213730557,\n \"\ acc_norm\": 
0.7142857142857143,\n \"acc_norm_stderr\": 0.023266512213730557\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n\ \ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.5952380952380952,\n\ \ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"\ acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"\ acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\"\ : 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\ \ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"\ acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n\ \ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.0195652367829309,\n \ \ \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.0195652367829309\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.5037037037037037,\n \"acc_stderr\": 0.03048470166508437,\n \ \ \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.03048470166508437\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\ \ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.5695364238410596,\n \"acc_stderr\": 0.04042809961395634,\n \"\ acc_norm\": 0.5695364238410596,\n \"acc_norm_stderr\": 0.04042809961395634\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"\ acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329883,\n \"\ acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329883\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"\ acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640273,\n \ \ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640273\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\ \ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\ \ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\ \ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073892,\n \"\ acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073892\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\ \ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\ \ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\ \ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n\ \ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n\ \ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\ \ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\ \ \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n\ \ \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9144316730523627,\n\ \ \"acc_stderr\": 0.010002965568647285,\n \"acc_norm\": 0.9144316730523627,\n\ \ \"acc_norm_stderr\": 0.010002965568647285\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442262,\n\ \ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442262\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6435754189944134,\n\ \ \"acc_stderr\": 0.01601823971051342,\n 
\"acc_norm\": 0.6435754189944134,\n\ \ \"acc_norm_stderr\": 0.01601823971051342\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.01989943546353996,\n\ \ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.01989943546353996\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n\ \ \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n\ \ \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.01924252622654454,\n\ \ \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.01924252622654454\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115882,\n \ \ \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115882\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6134289439374185,\n\ \ \"acc_stderr\": 0.012437288868088727,\n \"acc_norm\": 0.6134289439374185,\n\ \ \"acc_norm_stderr\": 0.012437288868088727\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.023345163616544838,\n\ \ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.023345163616544838\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.8088235294117647,\n \"acc_stderr\": 0.015908290136278067,\n \ \ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.015908290136278067\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\ \ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\ \ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\ \ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\ \ \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n\ \ \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \ \ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\ \ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\ \ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n\ \ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n\ \ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5972524285520616,\n\ \ \"mc2_stderr\": 0.014498261188889689\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330994\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6497346474601972,\n \ \ \"acc_stderr\": 0.013140409455571277\n }\n}\n```" repo_url: https://huggingface.co/logicker/SkkuDS-DPO-72B-v3 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|arc:challenge|25_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-01T22-16-08.253878.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|gsm8k|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_01T22_16_08.253878 path: - 
'**/details_harness|hellaswag|10_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T22-16-08.253878.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T22-16-08.253878.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T22-16-08.253878.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T22-16-08.253878.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T22-16-08.253878.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-01T22-16-08.253878.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T22-16-08.253878.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T22-16-08.253878.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-01T22-16-08.253878.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_01T22_16_08.253878 path: - '**/details_harness|winogrande|5_2024-03-01T22-16-08.253878.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-01T22-16-08.253878.parquet' - config_name: results data_files: - split: 
2024_03_01T22_16_08.253878 path: - results_2024-03-01T22-16-08.253878.parquet - split: latest path: - results_2024-03-01T22-16-08.253878.parquet
---

# Dataset Card for Evaluation run of logicker/SkkuDS-DPO-72B-v3

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [logicker/SkkuDS-DPO-72B-v3](https://huggingface.co/logicker/SkkuDS-DPO-72B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v3",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-03-01T22:16:08.253878](https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v3/blob/main/results_2024-03-01T22-16-08.253878.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7681004206920752, "acc_stderr": 0.027991870123942914, "acc_norm": 0.7729329467192335, "acc_norm_stderr": 0.028511620162705854, "mc1": 0.41982864137086906, "mc1_stderr": 0.01727703030177577, "mc2": 0.5972524285520616, "mc2_stderr": 0.014498261188889689 }, "harness|arc:challenge|25": { "acc": 0.6279863481228669, "acc_stderr": 0.014124597881844465, "acc_norm": 0.6604095563139932, "acc_norm_stderr": 0.013839039762820169 }, "harness|hellaswag|10": { "acc": 0.6684923322047401, "acc_stderr": 0.004697929774670292, "acc_norm": 0.8610834495120494, "acc_norm_stderr": 0.003451525868724679 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7333333333333333, "acc_stderr": 0.038201699145179055, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.038201699145179055 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.881578947368421, "acc_stderr": 0.026293995855474928, "acc_norm": 0.881578947368421, "acc_norm_stderr": 0.026293995855474928 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8226415094339623, "acc_stderr": 0.023508739218846934, "acc_norm": 0.8226415094339623, "acc_norm_stderr": 0.023508739218846934 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9166666666666666, "acc_stderr": 0.023112508176051236, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.023112508176051236 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.65, "acc_stderr": 0.04793724854411019, "acc_norm": 0.65, 
"acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7514450867052023, "acc_stderr": 0.03295304696818317, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.03295304696818317 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5588235294117647, "acc_stderr": 0.049406356306056595, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.82, "acc_stderr": 0.03861229196653695, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653695 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.8042553191489362, "acc_stderr": 0.025937853139977148, "acc_norm": 0.8042553191489362, "acc_norm_stderr": 0.025937853139977148 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5877192982456141, "acc_stderr": 0.046306532033665956, "acc_norm": 0.5877192982456141, "acc_norm_stderr": 0.046306532033665956 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7862068965517242, "acc_stderr": 0.03416520447747549, "acc_norm": 0.7862068965517242, "acc_norm_stderr": 0.03416520447747549 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.023266512213730557, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.023266512213730557 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5952380952380952, "acc_stderr": 0.04390259265377563, "acc_norm": 0.5952380952380952, "acc_norm_stderr": 0.04390259265377563 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8838709677419355, "acc_stderr": 0.018225757949432306, "acc_norm": 0.8838709677419355, "acc_norm_stderr": 0.018225757949432306 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6600985221674877, "acc_stderr": 0.033327690684107895, "acc_norm": 0.6600985221674877, "acc_norm_stderr": 0.033327690684107895 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066573, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066573 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9343434343434344, "acc_stderr": 0.01764652667723333, "acc_norm": 0.9343434343434344, "acc_norm_stderr": 0.01764652667723333 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9896373056994818, "acc_stderr": 0.007308424386792194, "acc_norm": 0.9896373056994818, "acc_norm_stderr": 0.007308424386792194 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8179487179487179, "acc_stderr": 0.0195652367829309, "acc_norm": 0.8179487179487179, "acc_norm_stderr": 0.0195652367829309 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.5037037037037037, "acc_stderr": 0.03048470166508437, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.03048470166508437 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8445378151260504, "acc_stderr": 0.023536818625398904, "acc_norm": 0.8445378151260504, "acc_norm_stderr": 0.023536818625398904 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5695364238410596, "acc_stderr": 0.04042809961395634, "acc_norm": 0.5695364238410596, "acc_norm_stderr": 0.04042809961395634 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.926605504587156, "acc_stderr": 0.011180976446357573, "acc_norm": 0.926605504587156, "acc_norm_stderr": 0.011180976446357573 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6990740740740741, "acc_stderr": 
0.03128039084329883, "acc_norm": 0.6990740740740741, "acc_norm_stderr": 0.03128039084329883 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9313725490196079, "acc_stderr": 0.017744453647073322, "acc_norm": 0.9313725490196079, "acc_norm_stderr": 0.017744453647073322 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9029535864978903, "acc_stderr": 0.019269323025640273, "acc_norm": 0.9029535864978903, "acc_norm_stderr": 0.019269323025640273 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.026936111912802273, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.026936111912802273 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9090909090909091, "acc_stderr": 0.026243194054073892, "acc_norm": 0.9090909090909091, "acc_norm_stderr": 0.026243194054073892 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8518518518518519, "acc_stderr": 0.03434300243630999, "acc_norm": 0.8518518518518519, "acc_norm_stderr": 0.03434300243630999 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8711656441717791, "acc_stderr": 0.02632138319878367, "acc_norm": 0.8711656441717791, "acc_norm_stderr": 0.02632138319878367 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6517857142857143, "acc_stderr": 0.04521829902833585, "acc_norm": 0.6517857142857143, "acc_norm_stderr": 0.04521829902833585 }, "harness|hendrycksTest-management|5": { "acc": 0.8640776699029126, "acc_stderr": 0.03393295729761011, "acc_norm": 0.8640776699029126, "acc_norm_stderr": 0.03393295729761011 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.015537514263253874, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.015537514263253874 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.85, "acc_stderr": 
0.035887028128263734, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263734 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9144316730523627, "acc_stderr": 0.010002965568647285, "acc_norm": 0.9144316730523627, "acc_norm_stderr": 0.010002965568647285 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8352601156069365, "acc_stderr": 0.019971040982442262, "acc_norm": 0.8352601156069365, "acc_norm_stderr": 0.019971040982442262 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6435754189944134, "acc_stderr": 0.01601823971051342, "acc_norm": 0.6435754189944134, "acc_norm_stderr": 0.01601823971051342 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8594771241830066, "acc_stderr": 0.01989943546353996, "acc_norm": 0.8594771241830066, "acc_norm_stderr": 0.01989943546353996 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8295819935691319, "acc_stderr": 0.02135534302826405, "acc_norm": 0.8295819935691319, "acc_norm_stderr": 0.02135534302826405 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8611111111111112, "acc_stderr": 0.01924252622654454, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.01924252622654454 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.624113475177305, "acc_stderr": 0.028893955412115882, "acc_norm": 0.624113475177305, "acc_norm_stderr": 0.028893955412115882 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6134289439374185, "acc_stderr": 0.012437288868088727, "acc_norm": 0.6134289439374185, "acc_norm_stderr": 0.012437288868088727 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8198529411764706, "acc_stderr": 0.023345163616544838, "acc_norm": 0.8198529411764706, "acc_norm_stderr": 0.023345163616544838 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8088235294117647, "acc_stderr": 0.015908290136278067, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.015908290136278067 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 
0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8326530612244898, "acc_stderr": 0.02389714476891452, "acc_norm": 0.8326530612244898, "acc_norm_stderr": 0.02389714476891452 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8905472636815921, "acc_stderr": 0.022076326101824667, "acc_norm": 0.8905472636815921, "acc_norm_stderr": 0.022076326101824667 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.94, "acc_stderr": 0.023868325657594194, "acc_norm": 0.94, "acc_norm_stderr": 0.023868325657594194 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.03851597683718533, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.03851597683718533 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8830409356725146, "acc_stderr": 0.02464806896136616, "acc_norm": 0.8830409356725146, "acc_norm_stderr": 0.02464806896136616 }, "harness|truthfulqa:mc|0": { "mc1": 0.41982864137086906, "mc1_stderr": 0.01727703030177577, "mc2": 0.5972524285520616, "mc2_stderr": 0.014498261188889689 }, "harness|winogrande|5": { "acc": 0.8263614838200474, "acc_stderr": 0.010646116480330994 }, "harness|gsm8k|5": { "acc": 0.6497346474601972, "acc_stderr": 0.013140409455571277 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
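The per-task entries in the results JSON above all follow the key pattern `harness|<task>|<n_shots>`. As a minimal, self-contained sketch of how those keys can be parsed, the snippet below filters the MMLU (`hendrycksTest-*`) sub-scores and macro-averages them; the `results` dict is a small hand-copied excerpt of the JSON shown in "Latest results", not data fetched from the Hub:

```python
# Sketch: parse per-task keys from an excerpt of the aggregated results JSON.
# The dict below is a hand-copied excerpt of the card's results, not the full run.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6604095563139932},
    "harness|hellaswag|10": {"acc_norm": 0.8610834495120494},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.42},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.7333333333333333},
}

def parse_key(key):
    """Split a harness key of the form 'harness|<task>|<n_shots>' into (task, n_shots)."""
    _, task, shots = key.split("|")
    return task, int(shots)

# Macro-average the MMLU (hendrycksTest-*) sub-task scores present in the excerpt.
mmlu_scores = [
    entry["acc_norm"]
    for key, entry in results.items()
    if parse_key(key)[0].startswith("hendrycksTest-")
]
print(sum(mmlu_scores) / len(mmlu_scores))
```

The same filtering works unchanged on the full per-task dict loaded from the "results" configuration of this dataset.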
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_98
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1247076628.0 num_examples: 244909 download_size: 1272986790 dataset_size: 1247076628.0 --- # Dataset Card for "chunk_98" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AlignmentResearch/EnronSpam
--- dataset_info: - config_name: default features: - name: text dtype: string - name: clf_label dtype: int64 - name: chunked_text sequence: string splits: - name: train num_bytes: 56981232 num_examples: 29567 - name: validation num_bytes: 3588410 num_examples: 1870 download_size: 37440525 dataset_size: 60569642 - config_name: neg features: - name: text dtype: string - name: clf_label dtype: int64 - name: chunked_text sequence: string splits: - name: train num_bytes: 27878733.118409038 num_examples: 14466 - name: validation num_bytes: 1773096.705882353 num_examples: 924 download_size: 17785441 dataset_size: 29651829.82429139 - config_name: pos features: - name: text dtype: string - name: clf_label dtype: int64 - name: chunked_text sequence: string splits: - name: train num_bytes: 29102498.881590962 num_examples: 15101 - name: validation num_bytes: 1815313.294117647 num_examples: 946 download_size: 18877626 dataset_size: 30917812.17570861 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - config_name: neg data_files: - split: train path: neg/train-* - split: validation path: neg/validation-* - config_name: pos data_files: - split: train path: pos/train-* - split: validation path: pos/validation-* ---
tyzhu/wiki_find_passage_train50_eval10_num
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: inputs dtype: string - name: targets dtype: string splits: - name: train num_bytes: 69886 num_examples: 110 - name: validation num_bytes: 6982 num_examples: 10 download_size: 37303 dataset_size: 76868 --- # Dataset Card for "wiki_find_passage_train50_eval10_num" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_TomGrc__FusionNet
--- pretty_name: Evaluation run of TomGrc/FusionNet dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TomGrc/FusionNet](https://huggingface.co/TomGrc/FusionNet) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TomGrc__FusionNet\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-04T12:12:49.231518](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet/blob/main/results_2024-01-04T12-12-49.231518.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6672981908741114,\n\ \ \"acc_stderr\": 0.031616068911940555,\n \"acc_norm\": 0.6681680299548688,\n\ \ \"acc_norm_stderr\": 0.032258823353895884,\n \"mc1\": 0.5740514075887393,\n\ \ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7195314778980147,\n\ \ \"mc2_stderr\": 0.015001196424578202\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\ \ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7133041226847242,\n\ \ \"acc_stderr\": 0.004512940497462742,\n \"acc_norm\": 0.8841864170483967,\n\ \ \"acc_norm_stderr\": 0.0031934725302821725\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\ \ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\ \ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \ \ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\ \ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\ \ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\ \ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\ \ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\ \ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4973544973544973,\n \"acc_stderr\": 0.02575094967813039,\n \"\ acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.02575094967813039\n\ \ 
},\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\ \ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\ \ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\ \ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\ \ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\ \ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\ : 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\ \ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\ acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\ \ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\ \ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \ \ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\ acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\ acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\ acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\ acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \ \ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\ \ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\ \ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\ \ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7768595041322314,\n \"acc_stderr\": 
0.03800754475228733,\n \"\ acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\ \ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\ \ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\ \ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\ \ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\ \ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n\ \ \"acc_stderr\": 0.016329061073207446,\n \"acc_norm\": 0.39217877094972065,\n\ \ \"acc_norm_stderr\": 0.016329061073207446\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\ \ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\ \ \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n\ \ \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n\ \ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\ \ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\ \ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\ \ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\ \ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\ \ 
\"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\ \ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \ \ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\ \ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\ \ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\ \ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\ \ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7195314778980147,\n\ \ \"mc2_stderr\": 0.015001196424578202\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6504927975739196,\n \ \ \"acc_stderr\": 0.013133836511705991\n }\n}\n```" repo_url: https://huggingface.co/TomGrc/FusionNet leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|arc:challenge|25_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-04T12-12-49.231518.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|gsm8k|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hellaswag|10_2024-01-04T12-12-49.231518.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-12-49.231518.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-12-49.231518.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-12-49.231518.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-12-49.231518.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-12-49.231518.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-12-49.231518.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-12-49.231518.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-management|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-12-49.231518.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|truthfulqa:mc|0_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-04T12-12-49.231518.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_04T12_12_49.231518 path: - '**/details_harness|winogrande|5_2024-01-04T12-12-49.231518.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-04T12-12-49.231518.parquet' - config_name: results data_files: - split: 
2024_01_04T12_12_49.231518 path: - results_2024-01-04T12-12-49.231518.parquet - split: latest path: - results_2024-01-04T12-12-49.231518.parquet --- # Dataset Card for Evaluation run of TomGrc/FusionNet <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [TomGrc/FusionNet](https://huggingface.co/TomGrc/FusionNet) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TomGrc__FusionNet", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-04T12:12:49.231518](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet/blob/main/results_2024-01-04T12-12-49.231518.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6672981908741114, "acc_stderr": 0.031616068911940555, "acc_norm": 0.6681680299548688, "acc_norm_stderr": 0.032258823353895884, "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.7195314778980147, "mc2_stderr": 0.015001196424578202 }, "harness|arc:challenge|25": { "acc": 0.6834470989761092, "acc_stderr": 0.013592431519068079, "acc_norm": 0.712457337883959, "acc_norm_stderr": 0.013226719056266125 }, "harness|hellaswag|10": { "acc": 0.7133041226847242, "acc_stderr": 0.004512940497462742, "acc_norm": 0.8841864170483967, "acc_norm_stderr": 0.0031934725302821725 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.44, "acc_stderr": 0.0498887651569859, "acc_norm": 0.44, "acc_norm_stderr": 0.0498887651569859 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.02863723563980089, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.02863723563980089 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956913, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956913 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.625531914893617, "acc_stderr": 0.03163910665367291, "acc_norm": 0.625531914893617, "acc_norm_stderr": 0.03163910665367291 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6344827586206897, "acc_stderr": 0.040131241954243856, "acc_norm": 0.6344827586206897, "acc_norm_stderr": 0.040131241954243856 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4973544973544973, "acc_stderr": 0.02575094967813039, "acc_norm": 0.4973544973544973, "acc_norm_stderr": 0.02575094967813039 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8193548387096774, "acc_stderr": 0.021886178567172534, "acc_norm": 0.8193548387096774, "acc_norm_stderr": 0.021886178567172534 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8121212121212121, "acc_stderr": 0.03050193405942914, "acc_norm": 0.8121212121212121, "acc_norm_stderr": 0.03050193405942914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8686868686868687, "acc_stderr": 0.024063156416822516, "acc_norm": 0.8686868686868687, "acc_norm_stderr": 0.024063156416822516 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.02944316932303154, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.02944316932303154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634332, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634332 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374308, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374308 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 0.03372343271653062, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 
0.03372343271653062 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.02450980392156862, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.02450980392156862 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8481012658227848, "acc_stderr": 0.023363878096632446, "acc_norm": 0.8481012658227848, "acc_norm_stderr": 0.023363878096632446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596915, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596915 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.03492606476623791, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.03492606476623791 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.0230866350868414, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.0230866350868414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8045977011494253, "acc_stderr": 0.014179171373424383, "acc_norm": 0.8045977011494253, "acc_norm_stderr": 0.014179171373424383 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7543352601156069, "acc_stderr": 0.023176298203992005, "acc_norm": 0.7543352601156069, "acc_norm_stderr": 0.023176298203992005 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39217877094972065, "acc_stderr": 0.016329061073207446, "acc_norm": 0.39217877094972065, "acc_norm_stderr": 0.016329061073207446 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7549019607843137, "acc_stderr": 0.02463004897982478, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.02463004897982478 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.729903536977492, "acc_stderr": 0.02521804037341062, "acc_norm": 0.729903536977492, "acc_norm_stderr": 0.02521804037341062 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7839506172839507, "acc_stderr": 0.022899162918445806, "acc_norm": 0.7839506172839507, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4934810951760104, "acc_stderr": 0.012769150688867503, "acc_norm": 0.4934810951760104, "acc_norm_stderr": 0.012769150688867503 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7389705882352942, "acc_stderr": 0.026679252270103128, "acc_norm": 0.7389705882352942, "acc_norm_stderr": 0.026679252270103128 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 
0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466125, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466125 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686398, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.7195314778980147, "mc2_stderr": 0.015001196424578202 }, "harness|winogrande|5": { "acc": 0.8326756116811366, "acc_stderr": 0.010490608806828075 }, "harness|gsm8k|5": { "acc": 0.6504927975739196, "acc_stderr": 0.013133836511705991 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
CyberHarem/razia_granbluefantasy
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of razia (Granblue Fantasy) This is the dataset of razia (Granblue Fantasy), containing 137 images and their tags. The core tags of this character are `horns, long_hair, blonde_hair, breasts, blue_eyes, pointy_ears, large_breasts, very_long_hair, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 137 | 171.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 137 | 100.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 324 | 211.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 137 | 154.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 324 | 288.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/razia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/razia_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blush, draph, playboy_bunny, rabbit_ears, 1girl, fake_animal_ears, wrist_cuffs, blue_leotard, cleavage, detached_collar, solo, white_background, bare_shoulders, bowtie, simple_background, blue_footwear, full_body, high_heels, navel_cutout, rabbit_tail, fake_tail, open_mouth, thigh_strap, tray, white_pantyhose | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, armor, draph, looking_at_viewer, solo, blush, cleavage, gauntlets, gloves, open_mouth, simple_background, thighhighs, bare_shoulders, skirt, white_background, holding_weapon | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, draph, hat, solo, looking_at_viewer, black_gloves, blush, simple_background, juliet_sleeves, white_background, dress, pelvic_curtain, black_thighhighs | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, draph, looking_at_viewer, 
ponytail, solo, blue_skirt, blush, bag, black_thighhighs, frills, green_jacket, hair_bow, long_sleeves, navel, necklace, school_uniform, belt, blazer, cleavage, open_mouth, simple_background, white_shirt, hand_on_hip, miniskirt, open_jacket, panties | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | draph | playboy_bunny | rabbit_ears | 1girl | fake_animal_ears | wrist_cuffs | blue_leotard | cleavage | detached_collar | solo | white_background | bare_shoulders | bowtie | simple_background | blue_footwear | full_body | high_heels | navel_cutout | rabbit_tail | fake_tail | open_mouth | thigh_strap | tray | white_pantyhose | armor | looking_at_viewer | gauntlets | gloves | thighhighs | skirt | holding_weapon | hat | black_gloves | juliet_sleeves | dress | pelvic_curtain | black_thighhighs | ponytail | blue_skirt | bag | frills | green_jacket | hair_bow | long_sleeves | navel | necklace | school_uniform | belt | blazer | white_shirt | hand_on_hip | miniskirt | open_jacket | panties | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------------|:--------------|:--------|:-------------------|:--------------|:---------------|:-----------|:------------------|:-------|:-------------------|:-----------------|:---------|:--------------------|:----------------|:------------|:-------------|:---------------|:--------------|:------------|:-------------|:--------------|:-------|:------------------|:--------|:--------------------|:------------|:---------|:-------------|:--------|:-----------------|:------|:---------------|:-----------------|:--------|:-----------------|:-------------------|:-----------|:-------------|:------|:---------|:---------------|:-----------|:---------------|:--------|:-----------|:-----------------|:-------|:---------|:--------------|:--------------|:------------|:--------------|:----------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | X | | | | X | | X | X | X | | X | | | | | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | X | | | | | | X | X | | | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 3 | 9 | 
![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | X | | | | X | | X | | | | X | | | | | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-41000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 1127204 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
thobauma/harmless-poisoned-0.01-SuperGodModeActivated-murder
--- dataset_info: features: - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 58402939.44335993 num_examples: 42537 download_size: 31364075 dataset_size: 58402939.44335993 configs: - config_name: default data_files: - split: train path: data/train-* ---
billray110/corpus-of-diverse-styles
--- annotations_creators: [] language_creators: - found language: [] license: [] multilinguality: - monolingual pretty_name: Corpus of Diverse Styles size_categories: - 10M<n<100M source_datasets: [] task_categories: - text-classification task_ids: [] --- # Dataset Card for Corpus of Diverse Styles ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) ## Disclaimer I am not the original author of the paper that presents the Corpus of Diverse Styles. I uploaded the dataset to HuggingFace as a convenience. ## Dataset Description - **Homepage:** http://style.cs.umass.edu/ - **Repository:** https://github.com/martiansideofthemoon/style-transfer-paraphrase - **Paper:** https://arxiv.org/abs/2010.05700 ### Dataset Summary A new benchmark dataset that contains 15M sentences from 11 diverse styles. To create CDS, we obtain data from existing academic research datasets and public APIs or online collections like Project Gutenberg. We choose styles that are easy for human readers to identify at a sentence level (e.g., Tweets or Biblical text). While prior benchmarks involve transfer between only two styles, CDS has 110 potential transfer directions. ### Citation Information ``` @inproceedings{style20, author = {Kalpesh Krishna and John Wieting and Mohit Iyyer}, booktitle = {Empirical Methods in Natural Language Processing}, year = {2020}, title = {Reformulating Unsupervised Style Transfer as Paraphrase Generation}, } ```
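The figure of 110 transfer directions follows directly from the style count: with 11 styles, every ordered (source, target) pair of distinct styles is one possible direction. A quick sanity check (only "Tweets" and "Biblical text" are named in the card, so the other style names below are placeholders):

```python
from itertools import permutations

# 11 styles; only two are named in the card, the rest are placeholder names.
styles = ["Tweets", "Biblical text"] + [f"style_{i}" for i in range(9)]

# Every ordered (source, target) pair of distinct styles is a transfer direction.
directions = list(permutations(styles, 2))
print(len(directions))  # 11 * 10 = 110
```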
Tommy0201/igbo_to_english_split
--- dataset_info: features: - name: igbo dtype: string - name: english dtype: string splits: - name: train num_bytes: 27990763 num_examples: 116772 - name: test num_bytes: 283925 num_examples: 1192 download_size: 17764903 dataset_size: 28274688 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
multi-train/emb-scitldr
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: query dtype: string - name: pos dtype: string - name: idx dtype: int64 - name: task_name dtype: string splits: - name: train num_bytes: 59114455 num_examples: 1992 download_size: 29584964 dataset_size: 59114455 --- # Dataset Card for "emb-scitldr" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bgspaditya/malurl-minpro
--- license: mit dataset_info: features: - name: url dtype: string - name: type dtype: string - name: type_code dtype: int64 splits: - name: train num_bytes: 43302335.10276401 num_examples: 520952 - name: val num_bytes: 5412791.887845501 num_examples: 65119 - name: test num_bytes: 5412875.009390486 num_examples: 65120 download_size: 32733332 dataset_size: 54128002.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: val path: data/val-* - split: test path: data/test-* ---
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_1.3b_Attributes_Caption_ns_5647
--- dataset_info: features: - name: id dtype: int64 - name: image dtype: image - name: prompt dtype: string - name: true_label dtype: string - name: prediction dtype: string - name: scores sequence: float64 splits: - name: fewshot_0_bs_16 num_bytes: 84346377.125 num_examples: 5647 - name: fewshot_1_bs_16 num_bytes: 85792216.125 num_examples: 5647 - name: fewshot_3_bs_16 num_bytes: 88692718.125 num_examples: 5647 - name: fewshot_5_bs_16 num_bytes: 91584252.125 num_examples: 5647 - name: fewshot_8_bs_16 num_bytes: 95913089.125 num_examples: 5647 download_size: 416449265 dataset_size: 446328652.625 --- # Dataset Card for "Caltech101_not_background_test_facebook_opt_1.3b_Attributes_Caption_ns_5647" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
juancopi81/jsbachmmmbar
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 21569479 num_examples: 27000 - name: test num_bytes: 601308 num_examples: 310 download_size: 3155226 dataset_size: 22170787 --- # Dataset Card for "jsbachmmmbar" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/VALUE_stsb_null_genetive
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: float64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 28614 num_examples: 141 - name: test num_bytes: 21904 num_examples: 104 - name: train num_bytes: 125384 num_examples: 658 download_size: 124757 dataset_size: 175902 --- # Dataset Card for "VALUE_stsb_null_genetive" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bible_para
--- annotations_creators: - found language_creators: - found language: - acu - af - agr - ake - am - amu - ar - bg - bsn - cak - ceb - ch - chq - chr - cjp - cni - cop - crp - cs - da - de - dik - dje - djk - dop - ee - el - en - eo - es - et - eu - fi - fr - gbi - gd - gu - gv - he - hi - hr - hu - hy - id - is - it - ja - jak - jiv - kab - kbh - kek - kn - ko - la - lt - lv - mam - mi - ml - mr - my - ne - nhg - nl - 'no' - ojb - pck - pes - pl - plt - pot - ppk - pt - quc - quw - ro - rom - ru - shi - sk - sl - sn - so - sq - sr - ss - sv - syr - te - th - tl - tmh - tr - uk - usp - vi - wal - wo - xh - zh - zu license: - cc0-1.0 multilinguality: - multilingual size_categories: - 10K<n<100K source_datasets: - original task_categories: - translation task_ids: [] paperswithcode_id: null pretty_name: BiblePara dataset_info: - config_name: de-en features: - name: id dtype: string - name: translation dtype: translation: languages: - de - en splits: - name: train num_bytes: 17262178 num_examples: 62195 download_size: 5440713 dataset_size: 17262178 - config_name: en-fr features: - name: id dtype: string - name: translation dtype: translation: languages: - en - fr splits: - name: train num_bytes: 17536445 num_examples: 62195 download_size: 5470044 dataset_size: 17536445 - config_name: en-es features: - name: id dtype: string - name: translation dtype: translation: languages: - en - es splits: - name: train num_bytes: 17105724 num_examples: 62191 download_size: 5418998 dataset_size: 17105724 - config_name: en-fi features: - name: id dtype: string - name: translation dtype: translation: languages: - en - fi splits: - name: train num_bytes: 17486055 num_examples: 62026 download_size: 5506407 dataset_size: 17486055 - config_name: en-no features: - name: id dtype: string - name: translation dtype: translation: languages: - en - 'no' splits: - name: train num_bytes: 16681323 num_examples: 62107 download_size: 5293164 dataset_size: 16681323 - config_name: en-hi features: - 
name: id dtype: string - name: translation dtype: translation: languages: - en - hi splits: - name: train num_bytes: 27849361 num_examples: 62073 download_size: 6224765 dataset_size: 27849361 --- # Dataset Card for BiblePara ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** http://opus.nlpl.eu/bible-uedin.php - **Repository:** None - **Paper:** https://link.springer.com/article/10.1007/s10579-014-9287-y - **Leaderboard:** [More Information Needed] - **Point of Contact:** [More Information Needed] ### Dataset Summary To load a language pair which isn't part of the config, all you need to do is specify the language code as pairs. You can find the valid pairs in Homepage section of Dataset Description: http://opus.nlpl.eu/bible-uedin.php E.g. 
`dataset = load_dataset("bible_para", lang1="fi", lang2="hi")` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data [More Information Needed] #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations [More Information Needed] #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset.
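For a quick sense of what a loaded example looks like, here is a sketch of the `translation` feature shape declared in the config above. The sentence text below is made up for illustration; only the field layout (`id` plus a `translation` dict keyed by language code) follows the dataset config.

```python
# Hypothetical example record, following the `translation` feature layout
# declared in the dataset_info above. The sentence text is illustrative only.
record = {
    "id": "0",
    "translation": {
        "en": "In the beginning God created the heaven and the earth.",
        "fi": "Alussa loi Jumala taivaan ja maan.",
    },
}

# Extract a source/target sentence pair, e.g. for training a translation model.
src = record["translation"]["en"]
tgt = record["translation"]["fi"]
print(src)
print(tgt)
```

The same access pattern applies to any language pair loaded via `lang1`/`lang2`.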
shanmon/musdb18
--- dataset_info: features: - name: full_image dtype: image - name: background_image dtype: image - name: drums_image dtype: image - name: full_caption dtype: string - name: stem_caption dtype: string - name: shorter_caption dtype: string splits: - name: train num_bytes: 285699217.11 num_examples: 2155 download_size: 281125226 dataset_size: 285699217.11 configs: - config_name: default data_files: - split: train path: data/train-* ---
liuyanchen1015/MULTI_VALUE_mnli_is_am_1s
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 177996 num_examples: 822 - name: dev_mismatched num_bytes: 124231 num_examples: 605 - name: test_matched num_bytes: 196789 num_examples: 839 - name: test_mismatched num_bytes: 123547 num_examples: 590 - name: train num_bytes: 7041207 num_examples: 31493 download_size: 4402918 dataset_size: 7663770 --- # Dataset Card for "MULTI_VALUE_mnli_is_am_1s" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Gokce/Generated_Restaurant_Reviews_GPT3.5
--- pretty_name: AI_Restaurant_Reviews license: cc-by-4.0 task_categories: - text-classification language: - tr tags: - food - Generated Review size_categories: - 1K<n<10K ---
hssd/hssd-hab
--- language: - en pretty_name: HSSD tags: - 3D scenes - Embodied AI license: cc-by-nc-4.0 extra_gated_heading: "Acknowledge license to accept the repository" extra_gated_prompt: "You agree to use this dataset under the [CC BY-NC 4.0 license](https://creativecommons.org/licenses/by-nc/4.0/) terms" viewer: false --- HSSD: Habitat Synthetic Scenes Dataset ================================== The [Habitat Synthetic Scenes Dataset (HSSD)](https://3dlg-hcvc.github.io/hssd/) is a human-authored 3D scene dataset that more closely mirrors real scenes than prior datasets. Our dataset represents real interiors and contains a diverse set of 211 scenes and more than 18000 models of real-world objects. <img src="https://i.imgur.com/XEkLxNs.png" width=50%> This repository provides a Habitat consumption-ready compressed version of HSSD. See [this repository](https://huggingface.co/datasets/hssd/hssd-models) for corresponding uncompressed assets. ## Dataset Structure ``` ├── objects │ ├── */*.glb │ ├── */*.collider.glb │ ├── */*.filteredSupportSurface(.ply|.glb) │ ├── */*.object_config.json ├── stages │ ├── *.glb │ ├── *.stage_config.json ├── scenes │ ├── *.scene_instance.json ├── scenes_uncluttered │ ├── *.scene_instance.json ├── scene_filter_files │ ├── *.rec_filter.json └── hssd-hab.scene_dataset_config.json └── hssd-hab-uncluttered.scene_dataset_config.json ``` - `hssd-hab.scene_dataset_config.json`: This SceneDataset config file aggregates the assets and metadata necessary to fully describe the set of stages, objects, and scenes constituting the dataset. - `objects`: 3D models representing distinct objects that are used to compose scenes. Contains configuration files, render assets, collider assets, and Receptacle mesh assets. - `stages`: A stage in Habitat is the set of static mesh components which make up the backdrop of a scene (e.g. floor, walls, stairs, etc.). - `scenes`: A scene is a single 3D world composed of a static stage and a variable number of objects. 
### Rearrange-ready assets: Supporting Habitat 3.0 embodied rearrangement tasks with updated colliders, adjusted and de-cluttered scene contents, receptacle meshes, and receptacle filter files. See [aihabitat.org/habitat3/](aihabitat.org/habitat3/) for more details. - `hssd-hab-uncluttered.scene_dataset_config.json`: This SceneDataset config file additionally aggregates the adjusted and uncluttered scenes for rearrangement tasks. - `scenes_uncluttered`: Contains the adjusted scene instance configuration files. - `scene_filter_files`: A scene filter file organizes available Receptacle instances in a scene into active and inactive groups based on simulation heuristics and manual edits. It is consumed by the RearrangeEpisodeGenerator to construct valid RearrangeEpisodeDatasets. ## Getting Started To load HSSD scenes into the Habitat simulator, you can start by installing [habitat-sim](https://github.com/facebookresearch/habitat-sim) using instructions specified [here](https://github.com/facebookresearch/habitat-sim#installation). Once installed, you can run the interactive Habitat viewer to load a scene: ``` habitat-viewer --dataset /path/to/hssd-hab/hssd-hab.scene_dataset_config.json -- 102344280 # or ./build/viewer if compiling from source ``` You can find more information about using the interactive viewer [here](https://github.com/facebookresearch/habitat-sim#testing:~:text=path/to/data/-,Interactive%20testing,-%3A%20Use%20the%20interactive). Habitat-Sim is typically used with [Habitat-Lab](https://github.com/facebookresearch/habitat-lab), a modular high-level library for end-to-end experiments in embodied AI. To define embodied AI tasks (e.g. navigation, instruction following, question answering), train agents, and benchmark their performance using standard metrics, you can download habitat-lab using the instructions provided [here](https://github.com/facebookresearch/habitat-lab#installation). 
## Changelog - `v0.2.5`: **Rearrange-ready HSSD** - Note: this is a checkpoint. Known issues exist and continued polish is ongoing. - Adds Receptacle meshes describing support surfaces for small objects (e.g. table or shelf surfaces). - Adds collider meshes (.collider.glb) for assets with Receptacle meshes to support simulation. - Adds new scenes 'scenes_uncluttered' and new SceneDataset 'hssd-hab-uncluttered' containing adjusted and de-cluttered versions of the scenes for use in embodied rearrangement tasks. - Adds 'scene_filter_files' which sort Receptacles in each scene into active and inactive groups for RearrangeEpisode generation. - `v0.2.4`: - Recompresses several object GLBs to preserve PBR material status. - Adds CSV with object metadata and semantic lexicon files for Habitat. - Adds train/val scene splits file. - `v0.2.3`: First release.
marianbasti/boletin-oficial-argentina
--- license: apache-2.0 language: - es tags: - argentina - law - government pretty_name: Boletín Oficial de la República Argentina size_categories: - 100K<n<1M --- # Boletín Oficial de la República Argentina This dataset is updated daily from [argentina.gob.ar](https://www.argentina.gob.ar/normativa), using the [SandboxAI library](https://github.com/sandbox-ai/Boletin-Oficial-Argentina). # Format The dataset format is as follows: ```json { "title":"Short title of the entry", "name":"Assigned name", "entity":"Government entity that issues it", "content":"Content of the entry", "date":"Publication date", "url":"relative url" } ``` # Usage You can use this dataset without downloading it in full, fetching filtered data with a single query. You can do it like this: ```python # In this example, we filter entries by date import requests API_TOKEN = "your_api_token" headers = {"Authorization": f"Bearer {API_TOKEN}"} date='2024-03-01' API_URL = f"https://datasets-server.huggingface.co/filter?dataset=marianbasti/boletin-oficial-argentina&config=default&split=train&where=date='{date}T00:00:00'" def query(): response = requests.get(API_URL, headers=headers) return response.json() data = query() ```
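Since the filter endpoint shown above is just a parameterized URL, building it for an arbitrary date can be factored into a small helper. This is a sketch using the same dataset ID and query parameters as the example in this card; the `filter_url` function name is ours.

```python
# Build the datasets-server /filter URL used in the example above
# for a given publication date (YYYY-MM-DD).
def filter_url(date: str) -> str:
    return (
        "https://datasets-server.huggingface.co/filter"
        "?dataset=marianbasti/boletin-oficial-argentina"
        "&config=default&split=train"
        f"&where=date='{date}T00:00:00'"
    )

url = filter_url("2024-03-01")
print(url)
```

The resulting URL can then be passed to `requests.get` with an `Authorization` header, exactly as in the example above.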
open-llm-leaderboard/details_microsoft__rho-math-1b-v0.1
--- pretty_name: Evaluation run of microsoft/rho-math-1b-v0.1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [microsoft/rho-math-1b-v0.1](https://huggingface.co/microsoft/rho-math-1b-v0.1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__rho-math-1b-v0.1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-15T19:01:31.040196](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__rho-math-1b-v0.1/blob/main/results_2024-04-15T19-01-31.040196.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2745151221891077,\n\ \ \"acc_stderr\": 0.031491300131365085,\n \"acc_norm\": 0.276206791521244,\n\ \ \"acc_norm_stderr\": 0.032331368454126375,\n \"mc1\": 0.2141982864137087,\n\ \ \"mc1_stderr\": 0.014362148155690466,\n \"mc2\": 0.35476320051457105,\n\ \ \"mc2_stderr\": 0.014012109219312441\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.3148464163822526,\n \"acc_stderr\": 0.013572657703084948,\n\ \ \"acc_norm\": 0.3430034129692833,\n \"acc_norm_stderr\": 0.013872423223718166\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41326428998207526,\n\ \ \"acc_stderr\": 0.00491413085543178,\n \"acc_norm\": 0.5333598884684326,\n\ \ \"acc_norm_stderr\": 0.004978662946687273\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\ \ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\ \ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\ \ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\ \ \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \ \ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051975,\n\ \ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051975\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\ \ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\ \ \"acc_norm_stderr\": 
0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\ \ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\ \ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\ \ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102967,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102967\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\ \ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\ \ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03724563619774634,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03724563619774634\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\ acc_norm\": 
0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\ \ \"acc_stderr\": 0.033954900208561116,\n \"acc_norm\": 0.1746031746031746,\n\ \ \"acc_norm_stderr\": 0.033954900208561116\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\ \ \"acc_stderr\": 0.024685979286239952,\n \"acc_norm\": 0.25161290322580643,\n\ \ \"acc_norm_stderr\": 0.024685979286239952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\ \ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\ : 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624335,\n\ \ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624335\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922987,\n \"\ acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922987\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565317,\n\ \ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565317\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654835,\n\ \ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654835\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \ \ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\ acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861493,\n \"\ acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861493\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"\ acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"\ acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \ \ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\ \ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n\ \ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462473,\n\ \ \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462473\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\ \ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\ \ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\ \ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\ \ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\ \ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n\ \ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\ \ \"acc_stderr\": 0.02891120880274946,\n \"acc_norm\": 0.26495726495726496,\n\ \ \"acc_norm_stderr\": 0.02891120880274946\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2771392081736909,\n\ \ \"acc_stderr\": 0.016005636294122425,\n \"acc_norm\": 0.2771392081736909,\n\ \ \"acc_norm_stderr\": 0.016005636294122425\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874037,\n\ \ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874037\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\ \ \"acc_stderr\": 0.014265554192331144,\n 
\"acc_norm\": 0.23910614525139665,\n\ \ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.02600480036395211,\n\ \ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.02600480036395211\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\ \ \"acc_stderr\": 0.0266644108869376,\n \"acc_norm\": 0.3279742765273312,\n\ \ \"acc_norm_stderr\": 0.0266644108869376\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\ \ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \ \ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n\ \ \"acc_stderr\": 0.011139857833598516,\n \"acc_norm\": 0.25554106910039115,\n\ \ \"acc_norm_stderr\": 0.011139857833598516\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.02777829870154544,\n\ \ \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.02777829870154544\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \ \ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\ \ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n\ \ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0289205832206756,\n\ \ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0289205832206756\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\ \ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\ \ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\ \ \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n\ \ \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\ \ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2141982864137087,\n\ \ \"mc1_stderr\": 0.014362148155690466,\n \"mc2\": 0.35476320051457105,\n\ \ \"mc2_stderr\": 0.014012109219312441\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5974743488555643,\n \"acc_stderr\": 0.013782866831703046\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/microsoft/rho-math-1b-v0.1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|arc:challenge|25_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-15T19-01-31.040196.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|gsm8k|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_15T19_01_31.040196 path: - 
'**/details_harness|hellaswag|10_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-01-31.040196.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-01-31.040196.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-01-31.040196.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-01-31.040196.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-01-31.040196.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-01-31.040196.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-01-31.040196.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-01-31.040196.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T19-01-31.040196.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_15T19_01_31.040196 path: - '**/details_harness|winogrande|5_2024-04-15T19-01-31.040196.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-15T19-01-31.040196.parquet' - config_name: results data_files: - split: 
2024_04_15T19_01_31.040196
    path:
    - results_2024-04-15T19-01-31.040196.parquet
  - split: latest
    path:
    - results_2024-04-15T19-01-31.040196.parquet
---

# Dataset Card for Evaluation run of microsoft/rho-math-1b-v0.1

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [microsoft/rho-math-1b-v0.1](https://huggingface.co/microsoft/rho-math-1b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_microsoft__rho-math-1b-v0.1",
	"harness_winogrande_5",
	split="latest")
```

## Latest results

These are the [latest results from run 2024-04-15T19:01:31.040196](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__rho-math-1b-v0.1/blob/main/results_2024-04-15T19-01-31.040196.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2745151221891077, "acc_stderr": 0.031491300131365085, "acc_norm": 0.276206791521244, "acc_norm_stderr": 0.032331368454126375, "mc1": 0.2141982864137087, "mc1_stderr": 0.014362148155690466, "mc2": 0.35476320051457105, "mc2_stderr": 0.014012109219312441 }, "harness|arc:challenge|25": { "acc": 0.3148464163822526, "acc_stderr": 0.013572657703084948, "acc_norm": 0.3430034129692833, "acc_norm_stderr": 0.013872423223718166 }, "harness|hellaswag|10": { "acc": 0.41326428998207526, "acc_stderr": 0.00491413085543178, "acc_norm": 0.5333598884684326, "acc_norm_stderr": 0.004978662946687273 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.28888888888888886, "acc_stderr": 0.0391545063041425, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.0391545063041425 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.24342105263157895, "acc_stderr": 0.034923496688842384, "acc_norm": 0.24342105263157895, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.19, "acc_stderr": 0.03942772444036623, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.25660377358490566, "acc_stderr": 0.026880647889051975, "acc_norm": 0.25660377358490566, "acc_norm_stderr": 0.026880647889051975 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.20833333333333334, "acc_stderr": 0.033961162058453336, "acc_norm": 0.20833333333333334, "acc_norm_stderr": 0.033961162058453336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, 
"acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617748, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617748 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102967, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102967 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03947152782669415, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03947152782669415 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.27586206896551724, "acc_stderr": 0.03724563619774634, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.03724563619774634 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30952380952380953, "acc_stderr": 0.023809523809523864, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.023809523809523864 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1746031746031746, "acc_stderr": 0.033954900208561116, "acc_norm": 0.1746031746031746, "acc_norm_stderr": 0.033954900208561116 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25161290322580643, "acc_stderr": 0.024685979286239952, "acc_norm": 0.25161290322580643, "acc_norm_stderr": 0.024685979286239952 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.32019704433497537, "acc_stderr": 0.032826493853041504, "acc_norm": 0.32019704433497537, "acc_norm_stderr": 0.032826493853041504 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2909090909090909, "acc_stderr": 0.03546563019624335, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.03546563019624335 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.20202020202020202, "acc_stderr": 0.02860620428922987, "acc_norm": 0.20202020202020202, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.2538860103626943, "acc_stderr": 0.03141024780565317, "acc_norm": 0.2538860103626943, "acc_norm_stderr": 0.03141024780565317 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2692307692307692, "acc_stderr": 0.022489389793654835, "acc_norm": 0.2692307692307692, "acc_norm_stderr": 0.022489389793654835 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.027080372815145668, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.027080372815145668 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.026265024608275882, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.026265024608275882 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.23302752293577983, "acc_stderr": 0.018125669180861493, "acc_norm": 0.23302752293577983, "acc_norm_stderr": 0.018125669180861493 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.33796296296296297, "acc_stderr": 
0.03225941352631295, "acc_norm": 0.33796296296296297, "acc_norm_stderr": 0.03225941352631295 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23039215686274508, "acc_stderr": 0.029554292605695053, "acc_norm": 0.23039215686274508, "acc_norm_stderr": 0.029554292605695053 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2616033755274262, "acc_stderr": 0.028609516716994934, "acc_norm": 0.2616033755274262, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.35874439461883406, "acc_stderr": 0.032190792004199956, "acc_norm": 0.35874439461883406, "acc_norm_stderr": 0.032190792004199956 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.1984732824427481, "acc_stderr": 0.03498149385462473, "acc_norm": 0.1984732824427481, "acc_norm_stderr": 0.03498149385462473 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.24074074074074073, "acc_stderr": 0.041331194402438376, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.041331194402438376 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3128834355828221, "acc_stderr": 0.036429145782924055, "acc_norm": 0.3128834355828221, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2767857142857143, "acc_stderr": 0.042466243366976256, "acc_norm": 0.2767857142857143, "acc_norm_stderr": 0.042466243366976256 }, "harness|hendrycksTest-management|5": { "acc": 0.2815533980582524, "acc_stderr": 0.044532548363264673, "acc_norm": 0.2815533980582524, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.26495726495726496, "acc_stderr": 0.02891120880274946, "acc_norm": 0.26495726495726496, "acc_norm_stderr": 0.02891120880274946 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.32, 
"acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2771392081736909, "acc_stderr": 0.016005636294122425, "acc_norm": 0.2771392081736909, "acc_norm_stderr": 0.016005636294122425 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2514450867052023, "acc_stderr": 0.023357365785874037, "acc_norm": 0.2514450867052023, "acc_norm_stderr": 0.023357365785874037 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.014265554192331144, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.014265554192331144 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2908496732026144, "acc_stderr": 0.02600480036395211, "acc_norm": 0.2908496732026144, "acc_norm_stderr": 0.02600480036395211 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3279742765273312, "acc_stderr": 0.0266644108869376, "acc_norm": 0.3279742765273312, "acc_norm_stderr": 0.0266644108869376 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.28703703703703703, "acc_stderr": 0.025171041915309684, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.025171041915309684 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2907801418439716, "acc_stderr": 0.027090664368353178, "acc_norm": 0.2907801418439716, "acc_norm_stderr": 0.027090664368353178 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.25554106910039115, "acc_stderr": 0.011139857833598516, "acc_norm": 0.25554106910039115, "acc_norm_stderr": 0.011139857833598516 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2977941176470588, "acc_stderr": 0.02777829870154544, "acc_norm": 0.2977941176470588, "acc_norm_stderr": 0.02777829870154544 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25980392156862747, "acc_stderr": 0.017740899509177795, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.017740899509177795 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.2545454545454545, "acc_stderr": 0.041723430387053825, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2857142857142857, "acc_stderr": 0.0289205832206756, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.0289205832206756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.21890547263681592, "acc_stderr": 0.029239174636647, "acc_norm": 0.21890547263681592, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-virology|5": { "acc": 0.25903614457831325, "acc_stderr": 0.034106466140718564, "acc_norm": 0.25903614457831325, "acc_norm_stderr": 0.034106466140718564 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2807017543859649, "acc_stderr": 0.034462962170884265, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.2141982864137087, "mc1_stderr": 0.014362148155690466, "mc2": 0.35476320051457105, "mc2_stderr": 0.014012109219312441 }, "harness|winogrande|5": { "acc": 0.5974743488555643, "acc_stderr": 0.013782866831703046 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
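The per-task results JSON shown earlier in this card follows a regular `harness|<task>|<n_shots>` key scheme, so aggregate metrics can be recomputed locally. The sketch below uses a hypothetical two-task excerpt of that structure (the helper name `mean_acc_norm` is ours, not part of the card):

```python
# Hypothetical excerpt mirroring the per-task structure of the results JSON
# above: "harness|<task>|<n_shots>" keys mapping to metric dicts.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.28888888888888886},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.24342105263157895},
    "harness|truthfulqa:mc|0": {"mc2": 0.35476320051457105},
}

def mean_acc_norm(results, prefix="harness|hendrycksTest-"):
    """Average acc_norm over all tasks whose key starts with `prefix`."""
    scores = [v["acc_norm"] for k, v in results.items() if k.startswith(prefix)]
    return sum(scores) / len(scores)

print(round(mean_acc_norm(results), 4))  # → 0.2662
```

The same pattern extends to `acc`, `mc1`/`mc2`, or any other field present in a given task's entry.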
desik98/openorca7B-cot
--- license: apache-2.0 ---
Jojolands/Ayrao_dataset
--- license: openrail ---
Falah/3M_baghdad_city_SDXL_refiner_prompts
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 2081267730 num_examples: 3000000 download_size: 204168964 dataset_size: 2081267730 --- # 3M Prompts for Baghdad Historical Folk Old Building Heritage Art Welcome to the Baghdad Historical Folk Old Building Heritage Art Dataset. This dataset is curated to support art-style applications and projects that celebrate the rich heritage of historical folk buildings in Baghdad. ## Dataset Overview This dataset contains a collection of textual prompts, images, and related metadata that encapsulate the essence of Baghdad's historical folk old buildings. These prompts are carefully curated to inspire creative art-style generation, encouraging artists and enthusiasts to reinterpret and capture the unique architectural charm of Baghdad's heritage. ## Dataset Contents - **Textual Prompts:** The dataset includes a variety of descriptive textual prompts. These prompts can be used to guide art generation models in creating visual representations that capture the cultural, architectural, and historical significance of Baghdad's folk old buildings. - **Images:** Accompanying the prompts is a selection of images showcasing different aspects of the historical buildings. These images provide visual references that can help artists better understand the architectural details and artistic nuances. ## Dataset Usage Artists, developers, and researchers interested in generating art with a focus on Baghdad's historical folk old building heritage can use this dataset as a source of inspiration and reference. The prompts and images can serve as creative sparks to guide the generation of artwork that pays homage to the cultural heritage and architectural marvels of Baghdad.
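As a usage sketch, prompts can be filtered by keyword before being handed to an image model. The sample strings and the `filter_prompts` helper below are hypothetical stand-ins; in practice the rows would come from the `prompts` column of this dataset (e.g. via `datasets.load_dataset("Falah/3M_baghdad_city_SDXL_refiner_prompts")`):

```python
# Hypothetical stand-in prompts; real rows come from the dataset's
# single `prompts` column.
sample_prompts = [
    "old folk house in Baghdad, ornate wooden balcony, golden hour light",
    "modern glass tower, night skyline",
    "heritage courtyard house in Baghdad, brick arches, folk ornamentation",
]

def filter_prompts(prompts, keywords):
    """Keep prompts that mention at least one of the given keywords."""
    lowered = [k.lower() for k in keywords]
    return [p for p in prompts if any(k in p.lower() for k in lowered)]

heritage_prompts = filter_prompts(sample_prompts, ["folk", "heritage"])
print(len(heritage_prompts))  # → 2 (the two heritage-themed prompts)
```

With 3M prompts, loading with `streaming=True` avoids downloading the full archive up front.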
![image](https://huggingface.co/datasets/Falah/3M_future_baghdad_SDXL_refiner_prompts/resolve/main/ComfyUI_00082_.png) ![image](https://huggingface.co/datasets/Falah/3M_future_baghdad_SDXL_refiner_prompts/resolve/main/ComfyUI_00083_.png) ![image](https://huggingface.co/datasets/Falah/3M_future_baghdad_SDXL_refiner_prompts/resolve/main/ComfyUI_00084_.png) ## Citation If you use this dataset in your research or projects, please cite it as follows: ``` @misc{3M_baghdad_city_SDXL_refiner_prompts, author = {Falah.G.Salieh}, title = {3M Baghdad Prompts}, year = {2023}, publisher = {Hugging Face}, url = {https://huggingface.co/Falah/3M_baghdad_city_SDXL_refiner_prompts}, } ```
CyberHarem/centaur_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of centaur/セントー/半人马 (Azur Lane) This is the dataset of centaur/セントー/半人马 (Azur Lane), containing 58 images and their tags. The core tags of this character are `blonde_hair, breasts, long_hair, pointy_ears, large_breasts, ahoge, bangs, blue_eyes, hair_ornament, very_long_hair, green_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------|:-----------|:------------------| | raw | 58 | 70.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centaur_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 58 | 44.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centaur_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 135 | 88.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centaur_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 58 | 62.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centaur_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels.
| | stage3-p480-1200 | 135 | 116.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centaur_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/centaur_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, smile, underboob, detached_sleeves, elf, bare_shoulders, hair_flower, medium_breasts, blush, navel, simple_background, holding, single_thighhigh, white_background, panties, weapon | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, navel, solo, blush, white_bikini, frilled_bikini, elf, underboob, hair_bow, smile, bare_shoulders, collarbone, two_side_up, side-tie_bikini_bottom, standing, stomach, striped, blue_bow, blue_ribbon, cowboy_shot, medium_breasts, sidelocks, simple_background, thighs, water, white_background | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blue_dress, china_dress, cleavage_cutout, solo, white_gloves, bun_cover, double_bun, elbow_gloves, looking_at_viewer, blush, covered_navel, elf, holding_fan, smile, underboob_cutout, artist_name, closed_mouth, dual_wielding, folding_fan, full_body, head_tilt, high_heels, 
medium_breasts, panties, side_slit, sitting, sleeveless, standing, white_footwear | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | smile | underboob | detached_sleeves | elf | bare_shoulders | hair_flower | medium_breasts | blush | navel | simple_background | holding | single_thighhigh | white_background | panties | weapon | white_bikini | frilled_bikini | hair_bow | collarbone | two_side_up | side-tie_bikini_bottom | standing | stomach | striped | blue_bow | blue_ribbon | cowboy_shot | sidelocks | thighs | water | blue_dress | china_dress | cleavage_cutout | white_gloves | bun_cover | double_bun | elbow_gloves | covered_navel | holding_fan | underboob_cutout | artist_name | closed_mouth | dual_wielding | folding_fan | full_body | head_tilt | high_heels | side_slit | sitting | sleeveless | white_footwear | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:------------|:-------------------|:------|:-----------------|:--------------|:-----------------|:--------|:--------|:--------------------|:----------|:-------------------|:-------------------|:----------|:---------|:---------------|:-----------------|:-----------|:-------------|:--------------|:-------------------------|:-----------|:----------|:----------|:-----------|:--------------|:--------------|:------------|:---------|:--------|:-------------|:--------------|:------------------|:---------------|:------------|:-------------|:---------------|:----------------|:--------------|:-------------------|:--------------|:---------------|:----------------|:--------------|:------------|:------------|:-------------|:------------|:----------|:-------------|:-----------------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | 
![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | X | X | | X | X | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | | X | | | X | X | | | | | | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
anon345/ERBD
--- license: other license_name: under-review license_link: LICENSE --- # European Residential Building Dataset *This dataset is not released and only serves as a temporary placeholder for academic reviewers.* The European Residential Building dataset consists of 209,318 images of detached residential buildings sampled at random across the Netherlands and Denmark. We refer to our paper for more information on the dataset.
nuuck/kungmage
--- license: openrail ---
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xl_mode_C_A_T_ns_1880
--- dataset_info: features: - name: id dtype: int64 - name: prompt dtype: string - name: true_label dtype: string - name: prediction dtype: string splits: - name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices num_bytes: 1236606 num_examples: 1880 - name: fewshot_3_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices num_bytes: 2420007 num_examples: 1880 - name: fewshot_5_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_rices num_bytes: 3603873 num_examples: 1880 - name: fewshot_3_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices num_bytes: 2495169 num_examples: 1880 download_size: 2439789 dataset_size: 9755655 --- # Dataset Card for "DTD_parition1_test_google_flan_t5_xl_mode_C_A_T_ns_1880" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
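Given the schema above (`id`, `prompt`, `true_label`, `prediction`), a per-split accuracy check takes only a few lines. The rows below are hypothetical stand-ins for records in one of the fewshot splits:

```python
# Hypothetical stand-in rows following the schema listed above; the
# texture class names are illustrative.
rows = [
    {"id": 0, "prompt": "...", "true_label": "dotted", "prediction": "dotted"},
    {"id": 1, "prompt": "...", "true_label": "woven", "prediction": "striped"},
    {"id": 2, "prompt": "...", "true_label": "bumpy", "prediction": "bumpy"},
]

def accuracy(rows):
    """Fraction of rows whose prediction matches the true label."""
    correct = sum(r["prediction"] == r["true_label"] for r in rows)
    return correct / len(rows)

print(accuracy(rows))  # → 0.666... (2 of 3 correct)
```

The same check can be run per configuration to compare the fewshot settings listed in the split names.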
NLPC-UOM/Sinhala-Tamil-Aligned-Parallel-Corpus
--- annotations_creators: [] language: - si - ta license: - mit ---
surrey-nlp/plod-cw2
--- dataset_info: features: - name: tokens sequence: string - name: pos_tags sequence: string - name: ner_tags sequence: string splits: - name: train num_bytes: 958388 num_examples: 1072 - name: validation num_bytes: 119188 num_examples: 126 - name: test num_bytes: 119336 num_examples: 153 download_size: 244828 dataset_size: 1196912 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
sallylu/singdata_10s
--- license: unknown configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: audio dtype: audio splits: - name: train num_bytes: 29224735225.025 num_examples: 70115 download_size: 14823221945 dataset_size: 29224735225.025 ---
open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0
--- pretty_name: Evaluation run of meta-math/MetaMath-70B-V1.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [meta-math/MetaMath-70B-V1.0](https://huggingface.co/meta-math/MetaMath-70B-V1.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T06:53:02.758124](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0/blob/main/results_2023-10-27T06-53-02.758124.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.035968959731543626,\n\ \ \"em_stderr\": 0.0019069930004768872,\n \"f1\": 0.13366401006711418,\n\ \ \"f1_stderr\": 0.0024535730972056486,\n \"acc\": 0.6348774184360326,\n\ \ \"acc_stderr\": 0.01220774491883094\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.035968959731543626,\n \"em_stderr\": 0.0019069930004768872,\n\ \ \"f1\": 0.13366401006711418,\n \"f1_stderr\": 0.0024535730972056486\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44655041698256254,\n \ \ \"acc_stderr\": 0.013693566549743144\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918735\n\ \ }\n}\n```" repo_url: https://huggingface.co/meta-math/MetaMath-70B-V1.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|arc:challenge|25_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-04T06-01-20.870650.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T06_53_02.758124 path: - '**/details_harness|drop|3_2023-10-27T06-53-02.758124.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T06-53-02.758124.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T06_53_02.758124 path: - '**/details_harness|gsm8k|5_2023-10-27T06-53-02.758124.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T06-53-02.758124.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hellaswag|10_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_04T06_01_20.870650 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-01-20.870650.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_04T06_01_20.870650 path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-01-20.870650.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-04T06-01-20.870650.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T06_53_02.758124 path: - '**/details_harness|winogrande|5_2023-10-27T06-53-02.758124.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T06-53-02.758124.parquet' - config_name: results data_files: - split: 2023_10_04T06_01_20.870650 path: - results_2023-10-04T06-01-20.870650.parquet - split: 2023_10_27T06_53_02.758124 path: - results_2023-10-27T06-53-02.758124.parquet - split: latest path: - results_2023-10-27T06-53-02.758124.parquet --- # Dataset Card for Evaluation run of meta-math/MetaMath-70B-V1.0 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/meta-math/MetaMath-70B-V1.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [meta-math/MetaMath-70B-V1.0](https://huggingface.co/meta-math/MetaMath-70B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T06:53:02.758124](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0/blob/main/results_2023-10-27T06-53-02.758124.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each one in the "results" and the "latest" split for each eval): ```python { "all": { "em": 0.035968959731543626, "em_stderr": 0.0019069930004768872, "f1": 0.13366401006711418, "f1_stderr": 0.0024535730972056486, "acc": 0.6348774184360326, "acc_stderr": 0.01220774491883094 }, "harness|drop|3": { "em": 0.035968959731543626, "em_stderr": 0.0019069930004768872, "f1": 0.13366401006711418, "f1_stderr": 0.0024535730972056486 }, "harness|gsm8k|5": { "acc": 0.44655041698256254, "acc_stderr": 0.013693566549743144 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918735 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
AdamMashaka/MCV
--- license: apache-2.0 ---
tarungupta83/MidJourney_v5_Prompt_dataset
--- license: apache-2.0 --- This dataset contains raw prompts from MidJourney v5. Total records: 4,245,117 Sample data: | AuthorID | Author | Date | Content | Attachments | Reactions | | --- | --- | --- | --- | --- | --- | | 936929561302675456 | Midjourney Bot#9282 | 04/20/2023 12:00 AM | benjamin frankling with rayban sunglasses reflecting a usa flag walking on a side of penguin, whit... | [Link](https://cdn.discordapp.com/attachments/933565701162168371/1098276830525538494/vanDyke_benjamin_frank...) | | | 936929561302675456 | Midjourney Bot#9282 | 04/20/2023 12:00 AM | Street vendor robot in 80's Poland, meat market, fruit stall, communist style, real photo, real ph... | [Link](https://cdn.discordapp.com/attachments/933565701162168371/1098276841426526290/alepasztet_Street_vend...) | | | 936929561302675456 | Midjourney Bot#9282 | 04/20/2023 12:00 AM | one of the guys is looking at another man , in the style of kris knight, realistic, detailed rende... | [Link](https://cdn.discordapp.com/attachments/933565701162168371/1098276845394333818/iflwlou_one_of_the_guy...) | | You can clean the data with the help of the Data Clean notebook provided in the dataset.
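The exported `Content` column still carries Discord message metadata, so some post-processing is usually needed before the prompts can be used. Below is a minimal sketch of the kind of cleaning the bundled notebook might perform; the field names follow the sample table above, the sample records are illustrative, and the actual notebook's logic may differ:

```python
import re

# Records shaped like the sample rows above; Content holds the raw prompt,
# shown truncated with "..." in the export preview.
records = [
    {"Author": "Midjourney Bot#9282",
     "Content": "Street vendor robot in 80's Poland, meat market, fruit stall, "
                "communist style, real photo, real ph..."},
    {"Author": "SomeUser#0001", "Content": "unrelated chat message"},
]

def extract_prompts(rows):
    """Keep only messages posted by the MidJourney bot and strip any
    trailing '...' truncation marker from the prompt text."""
    prompts = []
    for row in rows:
        if row["Author"].startswith("Midjourney Bot"):
            prompts.append(re.sub(r"\.{3,}$", "", row["Content"]).strip())
    return prompts

print(extract_prompts(records))
```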
chunkeduptube/chunkis
--- license: artistic-2.0 ---
c0d3r69/fine2603
--- license: mit task_categories: - question-answering ---
dmayhem93/self-critiquing-base-test-continuations
--- dataset_info: features: - name: id dtype: string - name: split dtype: string - name: time dtype: float64 - name: labeler dtype: string - name: is_topic_based_summarization dtype: bool - name: prompt dtype: string - name: response dtype: string splits: - name: test num_bytes: 73016346 num_examples: 10647 download_size: 24539281 dataset_size: 73016346 --- # Dataset Card for "self-critiquing-base-test-continuations" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
datajuicer/alpaca-cot-en-refined-by-data-juicer
--- license: apache-2.0 task_categories: - text-generation language: - en tags: - data-juicer - fine-tuning size_categories: - 10M<n<100M --- # Alpaca-CoT -- EN (refined by Data-Juicer) A refined English version of the Alpaca-CoT dataset by [Data-Juicer](https://github.com/alibaba/data-juicer). Some "bad" samples were removed from the original dataset to make it higher quality. This dataset is usually used to fine-tune a Large Language Model. **Notice**: Here is a small subset for previewing. The whole dataset is available [here](https://dail-wlcb.oss-cn-wulanchabu.aliyuncs.com/LLM_data/our_refined_datasets/CFT/alpaca-cot-en-refine_result.jsonl) (about 226 GB). ## Dataset Information - Number of samples: 72,855,345 (keeping ~54.48% of the original dataset) ## Refining Recipe ```yaml # global parameters project_name: 'Data-Juicer-recipes-alpaca-cot-en' dataset_path: '/path/to/your/dataset' # path to your dataset directory or file export_path: '/path/to/your/dataset.jsonl' np: 50 # number of subprocesses used to process your dataset open_tracer: true # process schedule # a list of several process operators with their arguments process: - document_deduplicator: lowercase: true ignore_non_character: true - alphanumeric_filter: tokenization: false min_ratio: 0.1 - character_repetition_filter: rep_len: 10 max_ratio: 0.6 - flagged_words_filter: lang: en tokenization: true max_ratio: 0.017 - maximum_line_length_filter: min_len: 20 - text_length_filter: min_len: 30 - document_simhash_deduplicator: tokenization: space window_size: 3 lowercase: true ignore_pattern: '\p{P}' num_blocks: 9 hamming_distance: 7 ```
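To get a concrete feel for what two of the simpler operators in the recipe do, here is a rough pure-Python approximation of `alphanumeric_filter` (min_ratio 0.1) and `text_length_filter` (min_len 30). This is an illustrative sketch only, not Data-Juicer's actual implementation, which also handles tokenization options and other details:

```python
def alphanumeric_ratio(text: str) -> float:
    # Fraction of characters that are alphanumeric (the tokenization: false case).
    return sum(ch.isalnum() for ch in text) / max(len(text), 1)

def keep_sample(text: str, min_alnum_ratio: float = 0.1, min_len: int = 30) -> bool:
    """Rough approximation of two recipe filters: a sample is kept only if it
    is at least `min_len` characters long and at least `min_alnum_ratio` of
    its characters are alphanumeric."""
    return len(text) >= min_len and alphanumeric_ratio(text) >= min_alnum_ratio

print(keep_sample("Explain step by step why the sky appears blue during the day."))  # True
print(keep_sample("!!! ???"))  # False: too short and no alphanumeric characters
```

The other operators in the recipe (deduplicators, flagged-words and repetition filters) follow the same keep/drop pattern, just with more involved scoring.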
CyberHarem/asuna_pokemon
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of asuna/アスナ (Pokémon) This is the dataset of asuna/アスナ (Pokémon), containing 500 images and their tags. The core tags of this character are `red_hair, breasts, long_hair, red_eyes, ponytail, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:---|---:|:---|:---|:---|:---| | raw | 500 | 472.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 294.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1082 | 583.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 428.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1082 | 781.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. 
| ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/asuna_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, navel, nipples, solo, no_bra, pants_pull, pussy, shirt_lift, smile, jeans, no_panties, blush, female_pubic_hair, hair_over_one_eye, holding_poke_ball, medium_breasts, poke_ball_(basic), undressing | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | 
![](samples/1/clu1-sample4.png) | 1girl, crop_top, midriff, navel, :d, looking_at_viewer, open_mouth, solo, collarbone, tied_shirt, bangs, belt, simple_background, black_shirt, holding_poke_ball, poke_ball_(basic), white_background, cleavage, green_pants, sleeveless, hand_on_hip, standing, teeth | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, holding_poke_ball, poke_ball_(basic), bangs, cleavage, crop_top, looking_at_viewer, midriff, solo, belt, navel, teeth, tied_shirt, black_shirt, collarbone, green_pants, grin, hair_tie, jeans, sleeveless | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, crop_top, holding_poke_ball, midriff, poke_ball_(basic), smile, navel, open_mouth, fire, pokemon_(creature), belt, cleavage, hair_over_one_eye, jeans, looking_at_viewer, solo | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, crop_top, cropped_shirt, jeans, midriff, navel, smile, solo, looking_at_viewer, white_background, black_shirt, open_mouth, red_belt, simple_background | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, nipples, solo, blush, nude, one_eye_closed, open_mouth, onsen, smile, steam, towel, water | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, nipples, shirt_lift, solo, hair_over_one_eye, navel, blush, bottomless, open_mouth, pink_hair, pussy | | 7 | 9 | 
![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, hetero, nipples, sex, vaginal, 1boy, navel, penis, pussy, solo_focus, sweat, bar_censor, open_mouth, spread_legs, completely_nude, missionary, teeth | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1boy, 1girl, completely_nude, hetero, nipples, ass, blush, mixed_bathing, onsen, open_mouth, sex_from_behind, water, doggystyle, cum_in_pussy, looking_back, vaginal | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1boy, 1girl, hetero, nipples, penis, solo_focus, blush, facial, open_mouth, cum_in_mouth, censored, cum_on_body, ejaculation, nude, paizuri, shirt_lift, smile, sweat | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1boy, 1girl, hetero, nude, solo_focus, uncensored, blush, cum, hair_tie, nipples, open_mouth, licking_penis, saliva, tongue | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | nipples | solo | no_bra | pants_pull | pussy | shirt_lift | smile | jeans | no_panties | blush | female_pubic_hair | hair_over_one_eye | holding_poke_ball | medium_breasts | poke_ball_(basic) | undressing | crop_top | midriff | :d | looking_at_viewer | open_mouth | collarbone | tied_shirt | bangs | belt | simple_background | black_shirt | white_background | cleavage | green_pants | sleeveless | hand_on_hip | standing | teeth | grin | hair_tie | fire | pokemon_(creature) | cropped_shirt | red_belt | nude | one_eye_closed | onsen | steam | towel | water | 
bottomless | pink_hair | hetero | sex | vaginal | 1boy | penis | solo_focus | sweat | bar_censor | spread_legs | completely_nude | missionary | ass | mixed_bathing | sex_from_behind | doggystyle | cum_in_pussy | looking_back | facial | cum_in_mouth | censored | cum_on_body | ejaculation | paizuri | uncensored | cum | licking_penis | saliva | tongue | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:----------|:-------|:---------|:-------------|:--------|:-------------|:--------|:--------|:-------------|:--------|:--------------------|:--------------------|:--------------------|:-----------------|:--------------------|:-------------|:-----------|:----------|:-----|:--------------------|:-------------|:-------------|:-------------|:--------|:-------|:--------------------|:--------------|:-------------------|:-----------|:--------------|:-------------|:--------------|:-----------|:--------|:-------|:-----------|:-------|:---------------------|:----------------|:-----------|:-------|:-----------------|:--------|:--------|:--------|:--------|:-------------|:------------|:---------|:------|:----------|:-------|:--------|:-------------|:--------|:-------------|:--------------|:------------------|:-------------|:------|:----------------|:------------------|:-------------|:---------------|:---------------|:---------|:---------------|:-----------|:--------------|:--------------|:----------|:-------------|:------|:----------------|:---------|:---------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | | | | | | X | | | | | X | | X | | X | X | | X | | X | X | X | X | | X | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | | | | | X | X | | | | X | X | | X | | X | X | | X | X | | | | X | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | | | | X | X | | | | | | | | | X | X | | X | X | | | | | X | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | | | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | 
![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | | | X | X | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | | X | X | | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | | | | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | X | X | X | X | X | X | | | | | | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X |
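For the IMG+TXT packages listed in this card, each image ships with a same-named `.txt` file of tags. A minimal sketch of walking an extracted package follows, assuming the common convention of comma-separated tags in the `.txt` files (the helper name `iter_img_txt` is invented here):

```python
import os


def iter_img_txt(dataset_dir: str):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package.
    Assumes each image has a sibling .txt file of comma-separated tags."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue  # skip non-image files
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # image without a tag file
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```

The raw package instead carries its metadata in-band and is meant for the waifuc loader shown earlier in this card.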
multi-train/squad_pairs_1107
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: query dtype: string - name: pos sequence: string - name: neg sequence: string - name: task dtype: string - name: instruction struct: - name: query dtype: string - name: pos dtype: string - name: neg dtype: string splits: - name: train num_bytes: 131284545 num_examples: 87599 download_size: 27083693 dataset_size: 131284545 --- # Dataset Card for "squad_pairs_1107" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
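Read back as a concrete record, the schema above nests an `instruction` struct alongside the triplet fields. All values below are placeholders for illustration, not rows from the dataset — only the field names and nesting follow the card's schema:

```python
# Hypothetical record matching the schema (field names from the card,
# values are placeholders)
record = {
    "query": "an example question",
    "pos": ["a passage that answers the question"],  # sequence of strings
    "neg": ["an unrelated passage"],                 # sequence of strings
    "task": "squad_pairs",
    "instruction": {                                 # struct of three strings
        "query": "instruction text for the query side",
        "pos": "instruction text for positive passages",
        "neg": "instruction text for negative passages",
    },
}
```

This triplet layout (query, positive passages, negative passages, plus per-side instructions) is the shape typically consumed by contrastive retrieval-training losses.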
open-llm-leaderboard/details_Locutusque__OpenHercules-2.5-Mistral-7B
--- pretty_name: Evaluation run of Locutusque/OpenHercules-2.5-Mistral-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Locutusque/OpenHercules-2.5-Mistral-7B](https://huggingface.co/Locutusque/OpenHercules-2.5-Mistral-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__OpenHercules-2.5-Mistral-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-03T21:57:19.580960](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__OpenHercules-2.5-Mistral-7B/blob/main/results_2024-03-03T21-57-19.580960.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.643123025942256,\n\ \ \"acc_stderr\": 0.032110715543833025,\n \"acc_norm\": 0.6456519614200541,\n\ \ \"acc_norm_stderr\": 0.03274987155870263,\n \"mc1\": 0.31946144430844553,\n\ \ \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.4784174221633267,\n\ \ \"mc2_stderr\": 0.014681639192412207\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268445,\n\ \ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6501692889862577,\n\ \ \"acc_stderr\": 0.004759416464201141,\n \"acc_norm\": 0.8484365664210317,\n\ \ \"acc_norm_stderr\": 0.003578643387547848\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\ \ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\ \ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\ \ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\ \ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\ \ \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\ \ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\ \ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\ \ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\ acc_norm\": 0.41534391534391535,\n 
\"acc_norm_stderr\": 0.025379524910778408\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\ acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\ acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\ : 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\ acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768787,\n\ \ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768787\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113113,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113113\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \ \ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\ acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"\ acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n\ \ \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n\ \ \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n\ \ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\ \ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\ \ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\ \ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\ acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\ \ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\ \ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\ \ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\ \ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\ \ \"acc_stderr\": 0.013853724170922533,\n \"acc_norm\": 0.8160919540229885,\n\ \ \"acc_norm_stderr\": 0.013853724170922533\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n\ \ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n\ \ \"acc_stderr\": 0.01523507577671961,\n \"acc_norm\": 
0.293854748603352,\n\ \ \"acc_norm_stderr\": 0.01523507577671961\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888135,\n\ \ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888135\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\ \ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\ \ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\ \ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\ \ \"acc_stderr\": 0.012733671880342507,\n \"acc_norm\": 0.4621903520208605,\n\ \ \"acc_norm_stderr\": 0.012733671880342507\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\ \ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \ \ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\ \ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\ \ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\ \ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\ \ \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.4784174221633267,\n\ \ \"mc2_stderr\": 0.014681639192412207\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710681\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5921152388172858,\n \ \ \"acc_stderr\": 0.01353674207564309\n }\n}\n```" repo_url: https://huggingface.co/Locutusque/OpenHercules-2.5-Mistral-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|arc:challenge|25_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-03T21-57-19.580960.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|gsm8k|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_03T21_57_19.580960 
path: - '**/details_harness|hellaswag|10_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T21-57-19.580960.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-03T21-57-19.580960.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T21-57-19.580960.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T21-57-19.580960.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T21-57-19.580960.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-03T21-57-19.580960.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T21-57-19.580960.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-management|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T21-57-19.580960.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|truthfulqa:mc|0_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-03T21-57-19.580960.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_03T21_57_19.580960 path: - '**/details_harness|winogrande|5_2024-03-03T21-57-19.580960.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-03T21-57-19.580960.parquet' - config_name: results data_files: - split: 
2024_03_03T21_57_19.580960 path: - results_2024-03-03T21-57-19.580960.parquet - split: latest path: - results_2024-03-03T21-57-19.580960.parquet --- # Dataset Card for Evaluation run of Locutusque/OpenHercules-2.5-Mistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Locutusque/OpenHercules-2.5-Mistral-7B](https://huggingface.co/Locutusque/OpenHercules-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Locutusque__OpenHercules-2.5-Mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-03T21:57:19.580960](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__OpenHercules-2.5-Mistral-7B/blob/main/results_2024-03-03T21-57-19.580960.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.643123025942256, "acc_stderr": 0.032110715543833025, "acc_norm": 0.6456519614200541, "acc_norm_stderr": 0.03274987155870263, "mc1": 0.31946144430844553, "mc1_stderr": 0.0163226441829605, "mc2": 0.4784174221633267, "mc2_stderr": 0.014681639192412207 }, "harness|arc:challenge|25": { "acc": 0.5964163822525598, "acc_stderr": 0.014337158914268445, "acc_norm": 0.6424914675767918, "acc_norm_stderr": 0.014005494275916573 }, "harness|hellaswag|10": { "acc": 0.6501692889862577, "acc_stderr": 0.004759416464201141, "acc_norm": 0.8484365664210317, "acc_norm_stderr": 0.003578643387547848 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03782728980865469, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.02794321998933713, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.02794321998933713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.03692820767264866, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.03692820767264866 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778408, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778408 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768787, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768787 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.024035489676335082, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.024035489676335082 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.02889774874113113, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.02889774874113113 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.015776239256163224, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.015776239256163224 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 
0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588667, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588667 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699803, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699803 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097652, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097652 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822585, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822585 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165612, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165612 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 
0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8160919540229885, "acc_stderr": 0.013853724170922533, "acc_norm": 0.8160919540229885, "acc_norm_stderr": 0.013853724170922533 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500107, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500107 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.293854748603352, "acc_stderr": 0.01523507577671961, "acc_norm": 0.293854748603352, "acc_norm_stderr": 0.01523507577671961 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7777777777777778, "acc_stderr": 0.023805186524888135, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.023805186524888135 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4621903520208605, "acc_stderr": 0.012733671880342507, "acc_norm": 0.4621903520208605, "acc_norm_stderr": 0.012733671880342507 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396556, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396556 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 
0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616914, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616914 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.31946144430844553, "mc1_stderr": 0.0163226441829605, "mc2": 0.4784174221633267, "mc2_stderr": 0.014681639192412207 }, "harness|winogrande|5": { "acc": 0.7892659826361483, "acc_stderr": 0.011462046419710681 }, "harness|gsm8k|5": { "acc": 0.5921152388172858, "acc_stderr": 0.01353674207564309 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
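As a quick sanity check, the per-task accuracies in the results JSON above can be aggregated with the standard library alone. A minimal sketch using a small, illustrative subset of values copied from the block (the subset choice is arbitrary, and the leaderboard's own MMLU aggregate averages over all 57 subtasks, so this subset mean will differ from the reported average):

```python
from statistics import mean

# Accuracies copied from the results JSON above (illustrative subset only).
acc = {
    "high_school_chemistry": 0.5172413793103449,
    "high_school_computer_science": 0.67,
    "machine_learning": 0.4732142857142857,
    "world_religions": 0.8304093567251462,
}

# Unweighted mean over the chosen subtasks.
subset_mean = mean(acc.values())
print(f"mean acc over {len(acc)} subtasks: {subset_mean:.4f}")
```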
ovior/twitter_dataset_1713122769
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 2388895 num_examples: 7448 download_size: 1349077 dataset_size: 2388895 configs: - config_name: default data_files: - split: train path: data/train-* ---
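From the split statistics above one can derive a rough per-record size; a stdlib-only sketch (the figures are taken verbatim from the card's metadata, and the derived number is approximate since `num_bytes` measures the uncompressed Arrow data, not the `download_size`):

```python
# Figures copied from the card's split metadata.
num_bytes = 2388895
num_examples = 7448

avg_bytes = num_bytes / num_examples
print(f"~{avg_bytes:.1f} bytes per tweet record")
```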
rroyc20/trainval
--- dataset_info: features: - name: label dtype: int64 - name: clean_text dtype: string splits: - name: train num_bytes: 3933706 num_examples: 42415 - name: val num_bytes: 1691755 num_examples: 18178 download_size: 3490856 dataset_size: 5625461 --- # Dataset Card for "trainval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
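The split metadata above can be cross-checked for internal consistency: the per-split `num_bytes` values should sum to the reported `dataset_size`. A stdlib-only sketch with the card's figures:

```python
# Per-split sizes and the reported total, copied from the card.
splits = {"train": 3933706, "val": 1691755}
dataset_size = 5625461

# The split bytes should account for the full dataset.
assert sum(splits.values()) == dataset_size
print("split sizes are consistent with dataset_size")
```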
open-llm-leaderboard/details_pszemraj__pythia-31m-simplewiki-scratch-bf16
--- pretty_name: Evaluation run of pszemraj/pythia-31m-simplewiki-scratch-bf16 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [pszemraj/pythia-31m-simplewiki-scratch-bf16](https://huggingface.co/pszemraj/pythia-31m-simplewiki-scratch-bf16)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pszemraj__pythia-31m-simplewiki-scratch-bf16\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T04:17:27.637926](https://huggingface.co/datasets/open-llm-leaderboard/details_pszemraj__pythia-31m-simplewiki-scratch-bf16/blob/main/results_2023-10-23T04-17-27.637926.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\ em_stderr\": 0.0,\n \"f1\": 0.007166526845637585,\n \"f1_stderr\"\ : 0.00042926617321096546,\n \"acc\": 0.2525651144435675,\n \"acc_stderr\"\ : 0.007025872980895258\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\ \ \"em_stderr\": 0.0,\n \"f1\": 0.007166526845637585,\n \"\ f1_stderr\": 0.00042926617321096546\n },\n \"harness|gsm8k|5\": {\n \ \ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790516\n\ \ }\n}\n```" repo_url: https://huggingface.co/pszemraj/pythia-31m-simplewiki-scratch-bf16 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|arc:challenge|25_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-15T05-06-47.331195.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T04_17_27.637926 path: - '**/details_harness|drop|3_2023-10-23T04-17-27.637926.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T04-17-27.637926.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T04_17_27.637926 path: - '**/details_harness|gsm8k|5_2023-10-23T04-17-27.637926.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T04-17-27.637926.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hellaswag|10_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-06-47.331195.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-06-47.331195.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-06-47.331195.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-06-47.331195.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-06-47.331195.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-06-47.331195.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-06-47.331195.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-management|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T05-06-47.331195.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_15T05_06_47.331195 path: - '**/details_harness|truthfulqa:mc|0_2023-09-15T05-06-47.331195.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-15T05-06-47.331195.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T04_17_27.637926 path: - '**/details_harness|winogrande|5_2023-10-23T04-17-27.637926.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T04-17-27.637926.parquet' - config_name: results data_files: - split: 2023_09_15T05_06_47.331195 path: - results_2023-09-15T05-06-47.331195.parquet - split: 2023_10_23T04_17_27.637926 path: - results_2023-10-23T04-17-27.637926.parquet - split: latest path: - results_2023-10-23T04-17-27.637926.parquet --- # Dataset Card for Evaluation run of pszemraj/pythia-31m-simplewiki-scratch-bf16 ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/pszemraj/pythia-31m-simplewiki-scratch-bf16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [pszemraj/pythia-31m-simplewiki-scratch-bf16](https://huggingface.co/pszemraj/pythia-31m-simplewiki-scratch-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pszemraj__pythia-31m-simplewiki-scratch-bf16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T04:17:27.637926](https://huggingface.co/datasets/open-llm-leaderboard/details_pszemraj__pythia-31m-simplewiki-scratch-bf16/blob/main/results_2023-10-23T04-17-27.637926.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0, "em_stderr": 0.0, "f1": 0.007166526845637585, "f1_stderr": 0.00042926617321096546, "acc": 0.2525651144435675, "acc_stderr": 0.007025872980895258 }, "harness|drop|3": { "em": 0.0, "em_stderr": 0.0, "f1": 0.007166526845637585, "f1_stderr": 0.00042926617321096546 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.505130228887135, "acc_stderr": 0.014051745961790516 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
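For this run, the top-level `"all"` block is simply the per-metric mean over the tasks that report that metric. A quick pure-Python check, using the accuracy values copied from the "Latest results" JSON above:

```python
# Accuracy values copied from the "Latest results" JSON above; only
# gsm8k and winogrande report "acc", so "all" averages those two tasks.
per_task_acc = {
    "harness|gsm8k|5": 0.0,
    "harness|winogrande|5": 0.505130228887135,
}

mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # matches "all" -> "acc": 0.2525651144435675
```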
namuSleep/law
--- task_categories: - text-classification language: - ko size_categories: - n<1K ---
mrjunos/depression-reddit-cleaned
--- license: cc-by-4.0 task_categories: - text-classification language: - en tags: - reddit - sentiment - depression pretty_name: Depression Reddit Cleaned size_categories: - 1K<n<10K --- # Depression: Reddit Dataset (Cleaned) **A cleaned, labelled Reddit dataset of ~7,000 posts on depression** ### Summary - The dataset provided is a Depression: Reddit Dataset (Cleaned) containing approximately 7,000 labeled instances. It consists of two main features: 'text' and 'label'. The 'text' feature contains the text data from Reddit posts related to depression, while the 'label' feature indicates whether a post is classified as depression-related or not. - The raw data for this dataset was collected by web scraping subreddits. To ensure the data's quality and usefulness, multiple natural language processing (NLP) techniques were applied to clean the data. The dataset exclusively consists of English-language posts, and its primary purpose is to facilitate mental health classification tasks. - This dataset can be employed in various natural language processing tasks related to depression, such as sentiment analysis, topic modeling, text classification, or any other NLP task that requires labeled data pertaining to depression from Reddit. - Extracted from Kaggle: https://www.kaggle.com/datasets/infamouscoder/depression-reddit-cleaned
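As a rough sketch of the text-classification use case mentioned above, here is a minimal baseline using scikit-learn. Note that the short example texts below are made-up placeholders standing in for the dataset's 'text' and 'label' fields, not actual rows from the dataset:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder texts standing in for the dataset's 'text' field; the
# labels mirror its binary 'label' field (1 = depression, 0 = not).
texts = [
    "i feel hopeless and empty every single day",
    "nothing matters anymore and i cannot sleep",
    "had a great hike with friends this weekend",
    "excited about my new job starting monday",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression is a reasonable first
# baseline for a binary text-classification task like this one.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

pred = clf.predict(["i feel so hopeless and empty"])[0]
```

On the real dataset, the same pipeline can be fit directly on the 'text' column with the 'label' column as the target.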
theirislin/synthetic_convai2_peacok_knowledge_linking
--- dataset_info: features: - name: dialog_id dtype: string - name: dialog_dict list: - name: type dtype: string - name: utter dtype: string - name: head_label dtype: bool - name: head_fact_text dtype: string - name: gpt_output_head dtype: string - name: tail_label dtype: bool - name: tail_fact_text dtype: string - name: gpt_output_tail dtype: string - name: peacok_relation dtype: string - name: label dtype: int64 splits: - name: train num_bytes: 42113941 num_examples: 35821 - name: valid num_bytes: 2791837 num_examples: 3981 download_size: 20866464 dataset_size: 44905778 configs: - config_name: default data_files: - split: train path: data/train-* - split: valid path: data/valid-* ---
PIsForPotato/BrailleDataset1
--- license: openrail ---
kiviki/SlovakSum
--- license: openrail --- The SlovakSum dataset from the paper *SlovakSum: Slovak News Summarization Dataset*.
joey234/mmlu-computer_security-neg
--- dataset_info: features: - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: question dtype: string splits: - name: test num_bytes: 24710 num_examples: 100 download_size: 17476 dataset_size: 24710 --- # Dataset Card for "mmlu-computer_security-neg" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
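The `answer` feature above is a class label, i.e. an integer index into the names list `["A", "B", "C", "D"]`. A minimal pure-Python sketch of decoding it (when loading with 🤗 `datasets`, the same mapping is also available via the feature's `int2str` method):

```python
# Index-to-letter mapping taken from the class_label names above.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(idx: int) -> str:
    """Map a stored class-label index back to its answer letter."""
    return ANSWER_NAMES[idx]

print(decode_answer(2))  # -> C
```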
irds/aquaint_trec-robust-2005
--- pretty_name: '`aquaint/trec-robust-2005`' viewer: false source_datasets: ['irds/aquaint'] task_categories: - text-retrieval --- # Dataset Card for `aquaint/trec-robust-2005` The `aquaint/trec-robust-2005` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/aquaint#aquaint/trec-robust-2005). # Data This dataset provides: - `queries` (i.e., topics); count=50 - `qrels`: (relevance assessments); count=37,798 - For `docs`, use [`irds/aquaint`](https://huggingface.co/datasets/irds/aquaint) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/aquaint_trec-robust-2005', 'queries') for record in queries: record # {'query_id': ..., 'title': ..., 'description': ..., 'narrative': ...} qrels = load_dataset('irds/aquaint_trec-robust-2005', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @inproceedings{Voorhees2005Robust, title={Overview of the TREC 2005 Robust Retrieval Track}, author={Ellen M. Voorhees}, booktitle={TREC}, year={2005} } @misc{Graff2002Aquaint, title={The AQUAINT Corpus of English News Text}, author={David Graff}, year={2002}, url={https://catalog.ldc.upenn.edu/LDC2002T31}, publisher={Linguistic Data Consortium} } ```
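When evaluating a run against this dataset, it is often convenient to turn the flat qrels records into a per-query lookup. A minimal sketch, using hypothetical records that only illustrate the record schema shown above (real records come from the `qrels` config):

```python
from collections import defaultdict

# Hypothetical qrels records, shaped like the schema documented above:
# {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
qrels_records = [
    {"query_id": "303", "doc_id": "docA", "relevance": 1, "iteration": "0"},
    {"query_id": "303", "doc_id": "docB", "relevance": 0, "iteration": "0"},
    {"query_id": "322", "doc_id": "docC", "relevance": 2, "iteration": "0"},
]

# Index relevance assessments by query_id for constant-time lookup
# of a (query, document) pair during evaluation.
qrels = defaultdict(dict)
for rec in qrels_records:
    qrels[rec["query_id"]][rec["doc_id"]] = rec["relevance"]

print(qrels["303"])  # -> {'docA': 1, 'docB': 0}
```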
abdulhade/KurdishTextCorpus
--- license: afl-3.0 --- A collected corpus of Central Kurdish (Sorani) text, approximately 430 MB in size.
klima7/en-pl-translation
--- license: odbl task_categories: - translation language: - pl - en --- This dataset was created by translating part of [en-fr-translation-dataset](https://www.kaggle.com/datasets/dhruvildave/en-fr-translation-dataset) using [Argos Translate](https://github.com/argosopentech/argos-translate).
2OP/market-data
--- license: openrail task_categories: - summarization - text2text-generation language: - en pretty_name: u ---
open-llm-leaderboard/details_PulsarAI__CollectiveCognition-v1.1-Nebula-7B
--- pretty_name: Evaluation run of PulsarAI/CollectiveCognition-v1.1-Nebula-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/CollectiveCognition-v1.1-Nebula-7B](https://huggingface.co/PulsarAI/CollectiveCognition-v1.1-Nebula-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__CollectiveCognition-v1.1-Nebula-7B_public\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-11-12T21:42:17.063541](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__CollectiveCognition-v1.1-Nebula-7B_public/blob/main/results_2023-11-12T21-42-17.063541.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5655902624582015,\n\ \ \"acc_stderr\": 0.033540567370804734,\n \"acc_norm\": 0.5747445580416879,\n\ \ \"acc_norm_stderr\": 0.03431067576831402,\n \"mc1\": 0.38555691554467564,\n\ \ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.5353024010333743,\n\ \ \"mc2_stderr\": 0.015743888224866397,\n \"em\": 0.35675335570469796,\n\ \ \"em_stderr\": 0.004905829488253491,\n \"f1\": 0.4216977768456382,\n\ \ \"f1_stderr\": 0.0047367493845716785\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.014580637569995421,\n\ \ \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639013\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6309500099581756,\n\ \ \"acc_stderr\": 0.004815613144385404,\n \"acc_norm\": 0.8239394542919737,\n\ \ \"acc_norm_stderr\": 0.0038009327705977565\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n\ \ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\ \ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286623,\n\ \ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286623\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 
0.6319444444444444,\n\ \ \"acc_stderr\": 0.040329990539607175,\n \"acc_norm\": 0.6319444444444444,\n\ \ \"acc_norm_stderr\": 0.040329990539607175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\ \ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\ \ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936337,\n\ \ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936337\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n 
\"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885108,\n \"\ acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885108\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\ \ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\ \ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\ \ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n\ \ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\ \ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\ : 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\ \ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\ acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n\ \ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 
0.5615384615384615,\n \"acc_stderr\": 0.025158266016868592,\n\ \ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868592\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \ \ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7614678899082569,\n \"acc_stderr\": 0.018272575810231867,\n \"\ acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.018272575810231867\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\ acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7205882352941176,\n \"acc_stderr\": 0.03149328104507957,\n \"\ acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.03149328104507957\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \ \ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 
0.041649760719448786,\n\ \ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\ acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\ \ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\ \ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\ \ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\ \ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\ \ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260597,\n\ \ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260597\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\ \ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n\ \ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \ \ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\ \ \"acc_stderr\": 0.015016884698539892,\n \"acc_norm\": 0.7713920817369093,\n\ \ \"acc_norm_stderr\": 0.015016884698539892\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\ \ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\ \ },\n 
\"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n\ \ \"acc_stderr\": 0.014030149950805098,\n \"acc_norm\": 0.22793296089385476,\n\ \ \"acc_norm_stderr\": 0.014030149950805098\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\ \ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\ \ \"acc_stderr\": 0.027368078243971646,\n \"acc_norm\": 0.6334405144694534,\n\ \ \"acc_norm_stderr\": 0.027368078243971646\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\ \ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \ \ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\ \ \"acc_stderr\": 0.012650007999463888,\n \"acc_norm\": 0.4315514993481095,\n\ \ \"acc_norm_stderr\": 0.012650007999463888\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\ \ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \ \ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5755102040816327,\n 
\"acc_stderr\": 0.031642094879429414,\n\ \ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.031642094879429414\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\ \ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\ \ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\ \ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.5353024010333743,\n\ \ \"mc2_stderr\": 0.015743888224866397\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262008\n\ \ },\n \"harness|drop|3\": {\n \"em\": 0.35675335570469796,\n \ \ \"em_stderr\": 0.004905829488253491,\n \"f1\": 0.4216977768456382,\n\ \ \"f1_stderr\": 0.0047367493845716785\n },\n \"harness|gsm8k|5\":\ \ {\n \"acc\": 0.09552691432903715,\n \"acc_stderr\": 0.008096605771155759\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/CollectiveCognition-v1.1-Nebula-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|arc:challenge|25_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-11-12T21-42-17.063541.parquet' - config_name: 
harness_drop_3 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|drop|3_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|drop|3_2023-11-12T21-42-17.063541.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|gsm8k|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hellaswag|10_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T21-42-17.063541.parquet' - 
'**/details_harness|hendrycksTest-econometrics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T21-42-17.063541.parquet' - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T21-42-17.063541.parquet' - 
'**/details_harness|hendrycksTest-virology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T21-42-17.063541.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-12T21-42-17.063541.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-12T21-42-17.063541.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - 
'**/details_harness|hendrycksTest-anatomy|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T21-42-17.063541.parquet' - config_name: 
harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 
2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T21-42-17.063541.parquet' - 
config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T21-42-17.063541.parquet' - config_name: 
harness_hendrycksTest_management_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-management|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T21-42-17.063541.parquet' 
- split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-virology|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T21-42-17.063541.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 
2023_11_12T21_42_17.063541 path: - '**/details_harness|truthfulqa:mc|0_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-11-12T21-42-17.063541.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_11_12T21_42_17.063541 path: - '**/details_harness|winogrande|5_2023-11-12T21-42-17.063541.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-11-12T21-42-17.063541.parquet' - config_name: results data_files: - split: 2023_11_12T21_42_17.063541 path: - results_2023-11-12T21-42-17.063541.parquet - split: latest path: - results_2023-11-12T21-42-17.063541.parquet --- # Dataset Card for Evaluation run of PulsarAI/CollectiveCognition-v1.1-Nebula-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PulsarAI/CollectiveCognition-v1.1-Nebula-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/CollectiveCognition-v1.1-Nebula-7B](https://huggingface.co/PulsarAI/CollectiveCognition-v1.1-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
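For reference, each timestamped split name in the metadata above is simply the run timestamp with its `-` and `:` separators replaced by underscores (compare the run `2023-11-12T21:42:17.063541` with the split `2023_11_12T21_42_17.063541`). A minimal sketch of that mapping (the helper name is our own, not part of any API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map an ISO run timestamp to the split name used in this dataset's configs."""
    # Split names replace the "-" and ":" separators with underscores.
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-11-12T21:42:17.063541"))
# 2023_11_12T21_42_17.063541
```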
To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__CollectiveCognition-v1.1-Nebula-7B_public", "harness_winogrande_5", split="latest") ``` ## Latest results These are the [latest results from run 2023-11-12T21:42:17.063541](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__CollectiveCognition-v1.1-Nebula-7B_public/blob/main/results_2023-11-12T21-42-17.063541.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the timestamped and "latest" splits of each eval): ```python { "all": { "acc": 0.5655902624582015, "acc_stderr": 0.033540567370804734, "acc_norm": 0.5747445580416879, "acc_norm_stderr": 0.03431067576831402, "mc1": 0.38555691554467564, "mc1_stderr": 0.01703883901059167, "mc2": 0.5353024010333743, "mc2_stderr": 0.015743888224866397, "em": 0.35675335570469796, "em_stderr": 0.004905829488253491, "f1": 0.4216977768456382, "f1_stderr": 0.0047367493845716785 }, "harness|arc:challenge|25": { "acc": 0.5324232081911263, "acc_stderr": 0.014580637569995421, "acc_norm": 0.5810580204778157, "acc_norm_stderr": 0.014418106953639013 }, "harness|hellaswag|10": { "acc": 0.6309500099581756, "acc_stderr": 0.004815613144385404, "acc_norm": 0.8239394542919737, "acc_norm_stderr": 0.0038009327705977565 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5986842105263158, "acc_stderr": 0.03988903703336284, "acc_norm": 0.5986842105263158, "acc_norm_stderr": 0.03988903703336284 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.43, 
"acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6188679245283019, "acc_stderr": 0.029890609686286623, "acc_norm": 0.6188679245283019, "acc_norm_stderr": 0.029890609686286623 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6319444444444444, "acc_stderr": 0.040329990539607175, "acc_norm": 0.6319444444444444, "acc_norm_stderr": 0.040329990539607175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207763, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207763 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.03268335899936337, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.03268335899936337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 
0.04149886942192117
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.3915343915343915,
        "acc_stderr": 0.02513809138885108,
        "acc_norm": 0.3915343915343915,
        "acc_norm_stderr": 0.02513809138885108
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.38095238095238093,
        "acc_stderr": 0.04343525428949098,
        "acc_norm": 0.38095238095238093,
        "acc_norm_stderr": 0.04343525428949098
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.34,
        "acc_stderr": 0.04760952285695236,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.04760952285695236
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.6483870967741936,
        "acc_stderr": 0.027162537826948458,
        "acc_norm": 0.6483870967741936,
        "acc_norm_stderr": 0.027162537826948458
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.45320197044334976,
        "acc_stderr": 0.03502544650845872,
        "acc_norm": 0.45320197044334976,
        "acc_norm_stderr": 0.03502544650845872
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.57,
        "acc_stderr": 0.04975698519562428,
        "acc_norm": 0.57,
        "acc_norm_stderr": 0.04975698519562428
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7333333333333333,
        "acc_stderr": 0.03453131801885417,
        "acc_norm": 0.7333333333333333,
        "acc_norm_stderr": 0.03453131801885417
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.7676767676767676,
        "acc_stderr": 0.030088629490217487,
        "acc_norm": 0.7676767676767676,
        "acc_norm_stderr": 0.030088629490217487
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.8238341968911918,
        "acc_stderr": 0.02749350424454806,
        "acc_norm": 0.8238341968911918,
        "acc_norm_stderr": 0.02749350424454806
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.5615384615384615,
        "acc_stderr": 0.025158266016868592,
        "acc_norm": 0.5615384615384615,
        "acc_norm_stderr": 0.025158266016868592
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.2740740740740741,
        "acc_stderr": 0.027195934804085626,
        "acc_norm":
0.2740740740740741,
        "acc_norm_stderr": 0.027195934804085626
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.5588235294117647,
        "acc_stderr": 0.0322529423239964,
        "acc_norm": 0.5588235294117647,
        "acc_norm_stderr": 0.0322529423239964
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.3443708609271523,
        "acc_stderr": 0.038796870240733264,
        "acc_norm": 0.3443708609271523,
        "acc_norm_stderr": 0.038796870240733264
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.7614678899082569,
        "acc_stderr": 0.018272575810231867,
        "acc_norm": 0.7614678899082569,
        "acc_norm_stderr": 0.018272575810231867
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.39351851851851855,
        "acc_stderr": 0.03331747876370312,
        "acc_norm": 0.39351851851851855,
        "acc_norm_stderr": 0.03331747876370312
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.7205882352941176,
        "acc_stderr": 0.03149328104507957,
        "acc_norm": 0.7205882352941176,
        "acc_norm_stderr": 0.03149328104507957
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.729957805907173,
        "acc_stderr": 0.028900721906293426,
        "acc_norm": 0.729957805907173,
        "acc_norm_stderr": 0.028900721906293426
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6681614349775785,
        "acc_stderr": 0.03160295143776679,
        "acc_norm": 0.6681614349775785,
        "acc_norm_stderr": 0.03160295143776679
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.6564885496183206,
        "acc_stderr": 0.041649760719448786,
        "acc_norm": 0.6564885496183206,
        "acc_norm_stderr": 0.041649760719448786
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7603305785123967,
        "acc_stderr": 0.03896878985070417,
        "acc_norm": 0.7603305785123967,
        "acc_norm_stderr": 0.03896878985070417
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.6944444444444444,
        "acc_stderr": 0.044531975073749834,
        "acc_norm": 0.6944444444444444,
        "acc_norm_stderr": 0.044531975073749834
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.0457237235873743, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260597, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260597 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652265, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652265 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7713920817369093, "acc_stderr": 0.015016884698539892, "acc_norm": 0.7713920817369093, "acc_norm_stderr": 0.015016884698539892 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6184971098265896, "acc_stderr": 0.0261521986197268, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.0261521986197268 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.22793296089385476, "acc_stderr": 0.014030149950805098, "acc_norm": 0.22793296089385476, "acc_norm_stderr": 0.014030149950805098 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6405228758169934, "acc_stderr": 0.027475969910660952, "acc_norm": 0.6405228758169934, "acc_norm_stderr": 0.027475969910660952 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6334405144694534, "acc_stderr": 0.027368078243971646, "acc_norm": 0.6334405144694534, "acc_norm_stderr": 0.027368078243971646 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6820987654320988, "acc_stderr": 0.02591006352824088, "acc_norm": 0.6820987654320988, "acc_norm_stderr": 0.02591006352824088 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.42907801418439717, "acc_stderr": 0.02952591430255856, 
"acc_norm": 0.42907801418439717, "acc_norm_stderr": 0.02952591430255856 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4315514993481095, "acc_stderr": 0.012650007999463888, "acc_norm": 0.4315514993481095, "acc_norm_stderr": 0.012650007999463888 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5257352941176471, "acc_stderr": 0.030332578094555033, "acc_norm": 0.5257352941176471, "acc_norm_stderr": 0.030332578094555033 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6078431372549019, "acc_stderr": 0.019751726508762637, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.019751726508762637 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5755102040816327, "acc_stderr": 0.031642094879429414, "acc_norm": 0.5755102040816327, "acc_norm_stderr": 0.031642094879429414 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7512437810945274, "acc_stderr": 0.030567675938916718, "acc_norm": 0.7512437810945274, "acc_norm_stderr": 0.030567675938916718 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.038899512528272166, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.38555691554467564, "mc1_stderr": 0.01703883901059167, "mc2": 0.5353024010333743, "mc2_stderr": 0.015743888224866397 }, "harness|winogrande|5": { "acc": 0.7371744277821626, "acc_stderr": 0.012370922527262008 }, "harness|drop|3": { "em": 0.35675335570469796, "em_stderr": 
0.004905829488253491,
        "f1": 0.4216977768456382,
        "f1_stderr": 0.0047367493845716785
    },
    "harness|gsm8k|5": {
        "acc": 0.09552691432903715,
        "acc_stderr": 0.008096605771155759
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
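For quick side-by-side comparisons, the nested per-task blocks in the results JSON above can be flattened into one headline score per task. A minimal sketch over a hand-copied excerpt of those results, preferring `acc_norm` where the harness reports it (the `headline_scores` helper name is ours, not part of the leaderboard tooling):

```python
# Hand-copied excerpt of the results JSON shown above.
results = {
    "harness|arc:challenge|25": {"acc": 0.5324232081911263, "acc_norm": 0.5810580204778157},
    "harness|hellaswag|10": {"acc": 0.6309500099581756, "acc_norm": 0.8239394542919737},
    "harness|winogrande|5": {"acc": 0.7371744277821626},
    "harness|gsm8k|5": {"acc": 0.09552691432903715},
}

def headline_scores(results):
    """One score per task: acc_norm when present, otherwise acc.

    Task keys follow the harness convention "harness|<task>|<n_shots>",
    so the middle segment is the task name.
    """
    return {
        task.split("|")[1]: metrics.get("acc_norm", metrics.get("acc"))
        for task, metrics in results.items()
    }

print(headline_scores(results))
```

The same flattening works on the full JSON loaded from the results file linked above.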
d0rj/RuBQ_1.0
---
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
  - split: dev
    path: data/dev-*
dataset_info:
  features:
  - name: uid
    dtype: int64
  - name: question_text
    dtype: string
  - name: query
    dtype: string
  - name: answer_text
    dtype: string
  - name: question_uris
    sequence: string
  - name: question_props
    sequence: string
  - name: answers
    list:
    - name: datatype
      dtype: string
    - name: type
      dtype: string
    - name: value
      dtype: string
    - name: xml:lang
      dtype: string
  - name: tags
    sequence: string
  - name: question_eng
    dtype: string
  splits:
  - name: test
    num_bytes: 472281
    num_examples: 1200
  - name: dev
    num_bytes: 115029
    num_examples: 300
  download_size: 249954
  dataset_size: 587310
license: cc-by-sa-4.0
task_categories:
- question-answering
language:
- ru
- en
tags:
- qa
- machine reading
source_datasets:
- original
pretty_name: RuBQ 1.0
size_categories:
- 1K<n<10K
paperswithcode_id: rubq
---

# RuBQ 1.0

## Dataset Description

- **Repository:** https://github.com/vladislavneon/RuBQ/tree/master/RuBQ_1.0
- **Paper:** [RuBQ: A Russian Dataset for Question Answering over Wikidata](https://arxiv.org/abs/2005.10659)
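Each entry in the `answers` field carries an RDF term with the keys declared in the schema above (`datatype`, `type`, `value`, `xml:lang`), in the style of SPARQL JSON result bindings. A minimal sketch of extracting plain answer strings; the record below is a hypothetical example shaped like that schema, not taken from the dataset:

```python
# Hypothetical record shaped like the `answers` schema above
# (datatype / type / value / xml:lang per entry).
answers = [
    {"datatype": None, "type": "uri",
     "value": "http://www.wikidata.org/entity/Q649", "xml:lang": None},
    {"datatype": None, "type": "literal",
     "value": "Москва", "xml:lang": "ru"},
]

def plain_values(answers, lang=None):
    """Return the raw answer strings, optionally keeping only one language tag."""
    return [
        a["value"] for a in answers
        if lang is None or a.get("xml:lang") == lang
    ]

print(plain_values(answers))        # all values (URIs and literals)
print(plain_values(answers, "ru"))  # only Russian-tagged literals
```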
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-136000
---
dataset_info:
  features:
  - name: input_ids
    sequence:
      sequence: int32
  - name: attention_mask
    sequence:
      sequence: int8
  - name: labels
    sequence:
      sequence: int64
  splits:
  - name: train
    num_bytes: 6147896
    num_examples: 461
  download_size: 365464
  dataset_size: 6147896
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
nlplabtdtu/data-synthetic
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: QAs
    list:
    - name: Q
      dtype: string
    - name: A
      dtype: string
  - name: summary
    dtype: string
  splits:
  - name: train1
    num_bytes: 37222547
    num_examples: 2064
  - name: train2
    num_bytes: 87690645
    num_examples: 4484
  download_size: 58238796
  dataset_size: 124913192
configs:
- config_name: default
  data_files:
  - split: train1
    path: data/train1-*
  - split: train2
    path: data/train2-*
---
autoevaluate/autoeval-eval-lener_br-lener_br-bd0c63-1886364291
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
  task: entity_extraction
  model: Luciano/xlm-roberta-base-finetuned-lener_br-finetuned-lener-br
  metrics: []
  dataset_name: lener_br
  dataset_config: lener_br
  dataset_split: test
  col_mapping:
    tokens: tokens
    tags: ner_tags
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Token Classification
* Model: Luciano/xlm-roberta-base-finetuned-lener_br-finetuned-lener-br
* Dataset: lener_br
* Config: lener_br
* Split: test

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model.
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-f2f24c-49139145269
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
  task: extractive_question_answering
  model: 123tarunanand/roberta-base-finetuned
  metrics: ['bleu', 'accuracy', 'angelina-wang/directional_bias_amplification', 'bertscore', 'rouge', 'meteor']
  dataset_name: adversarial_qa
  dataset_config: adversarialQA
  dataset_split: validation
  col_mapping:
    context: context
    question: question
    answers-text: answers.text
    answers-answer_start: answers.answer_start
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Question Answering
* Model: 123tarunanand/roberta-base-finetuned
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@sysneedtolearn](https://huggingface.co/sysneedtolearn) for evaluating this model.
giacomog96/Tesi
---
license: unlicense
---
reach-vb/mls-eng-10k-repunct-test-spacy-v1
---
dataset_info:
  features:
  - name: original_path
    dtype: string
  - name: begin_time
    dtype: float64
  - name: end_time
    dtype: float64
  - name: transcript
    dtype: string
  - name: audio_duration
    dtype: float64
  - name: speaker_id
    dtype: string
  - name: book_id
    dtype: string
  - name: repunct_text
    dtype: string
  splits:
  - name: dev
    num_bytes: 2183089
    num_examples: 3807
  download_size: 1220542
  dataset_size: 2183089
configs:
- config_name: default
  data_files:
  - split: dev
    path: data/dev-*
---
ap539813/diabetes-llama2-wiki
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 36430
    num_examples: 143
  download_size: 18771
  dataset_size: 36430
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
ll-13/GLH-Bridge
---
license: mit
---
lhallee/MetalIonBinding_reg
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: valid
    path: data/valid-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: seqs
    dtype: string
  - name: labels
    dtype: float64
  splits:
  - name: train
    num_bytes: 1561586
    num_examples: 5068
  - name: valid
    num_bytes: 205883
    num_examples: 662
  - name: test
    num_bytes: 197893
    num_examples: 665
  download_size: 1600987
  dataset_size: 1965362
---

# Dataset Card for "MetalIonBinding_reg"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)