Columns: `datasetId` — string, length 2–117; `card` — string, length 19–1.01M.
yuan-sf63/word_label_0.2_72_D
--- dataset_info: features: - name: text dtype: string - name: '0' dtype: int64 - name: '1' dtype: int64 - name: '2' dtype: int64 - name: '3' dtype: int64 - name: '4' dtype: int64 - name: '5' dtype: int64 - name: '6' dtype: int64 - name: '7' dtype: int64 - name: '8' dtype: int64 - name: '9' dtype: int64 - name: '10' dtype: int64 - name: '11' dtype: int64 - name: '12' dtype: int64 - name: '13' dtype: int64 - name: '14' dtype: int64 - name: '15' dtype: int64 - name: '16' dtype: int64 - name: '17' dtype: int64 - name: '18' dtype: int64 - name: '19' dtype: int64 - name: '20' dtype: int64 - name: '21' dtype: int64 - name: '22' dtype: int64 - name: '23' dtype: int64 - name: '24' dtype: int64 - name: '25' dtype: int64 - name: '26' dtype: int64 - name: '27' dtype: int64 - name: '28' dtype: int64 - name: '29' dtype: int64 - name: '30' dtype: int64 - name: '31' dtype: int64 - name: '32' dtype: int64 - name: '33' dtype: int64 - name: '34' dtype: int64 - name: '35' dtype: int64 - name: '36' dtype: int64 - name: '37' dtype: int64 - name: '38' dtype: int64 - name: '39' dtype: int64 - name: '40' dtype: int64 - name: '41' dtype: int64 - name: '42' dtype: int64 - name: '43' dtype: int64 - name: '44' dtype: int64 - name: '45' dtype: int64 - name: '46' dtype: int64 - name: '47' dtype: int64 - name: '48' dtype: int64 - name: '49' dtype: int64 - name: '50' dtype: int64 - name: '51' dtype: int64 - name: '52' dtype: int64 - name: '53' dtype: int64 - name: '54' dtype: int64 - name: '55' dtype: int64 - name: '56' dtype: int64 - name: '57' dtype: int64 - name: '58' dtype: int64 - name: '59' dtype: int64 - name: '60' dtype: int64 - name: '61' dtype: int64 - name: '62' dtype: int64 - name: '63' dtype: int64 - name: '64' dtype: int64 - name: '65' dtype: int64 - name: '66' dtype: int64 - name: '67' dtype: int64 - name: '68' dtype: int64 - name: '69' dtype: int64 - name: '70' dtype: int64 - name: '71' dtype: int64 splits: - name: train num_bytes: 48978649.8 num_examples: 71901 - name: validation 
num_bytes: 5442072.2 num_examples: 7989 download_size: 8684772 dataset_size: 54420722.0 --- # Dataset Card for "word_label_0.2_72_D" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
silk-road/Haruhi-Zero
---
license: cc-by-4.0
---
# Training data for ChatHaruhi-Zero Extend

The total data size is not yet known; once it is, the dataset will be renamed Haruhi-Zero-XXX K. For now only a sample of each source is released; the full data will be published after the 1.0 model. Main project link: https://github.com/LC1332/Chat-Haruhi-Suzumiya. If you are interested in joining our training effort, contact chengli.thu@gmail.com.

Planned data sources:

- [x] Chinese novel data
- [x] Erotic novel data
- [x] ChatHaruhi 52K (converted to message format)
- [x] Chinese 13.2K (converted to message format)
- [x] Waifu-extended 0.2K (convert to message format if convenient; otherwise fall back to simple user–AI turns)
- [x] Claude-Baize data, 7.2K
- [x] PIPPA data, 1.68K
- [x] JanitorAI data
- [ ] PIPPA translated data
- [x] RoleLLM 1.6K (convert to message format if convenient; otherwise fall back to simple user–AI turns)

# 0.2 Further removed AI-assistant-related data
# 0.3 Added identity-recognition data
# 0.4 Added novel-extraction data
# 0.5 Added PIPPA translations; added profiles to the novel data

## Sponsorship

Seeking donations of Claude API access, OpenAI enterprise API access, and compute resources...
a686d380/h-corpus-raw
---
viewer: false
language:
- zh
---
Uncleaned Chinese H (erotic) novels

| Data | Articles | Uncompressed size | Source | Quality | Notes |
|- | - |- | - | - | - |
|jjsw | 73,432 | 4.0 GB | Jinji Shuwu (禁忌书屋) | High | - |
|pixiv-selected | 2,935 | 174.3 MB | pixiv rankings | High | - |
|shubao | 6,776 | 1.6 GB | Web | Low | - |
|sis-long | 4,555 | 3.5 GB | sis | Medium | - |
|sis-short | 111,237 | 4.1 GB | sis | Medium | - |
|xbookcn | 39,798 | 1.0 GB | xbookcn | High | - |
|xhs | 38,406 | 8.6 GB | Web | Medium | - |
|zyd2023 | 3,935 | 3.8 GB | Web | Medium | - |

For scientific research use only!
Gabriel1322/flokos
--- license: openrail ---
DialogueCharacter/english_preference_hh_helpful_unfiltered
--- dataset_info: features: - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 261122999 num_examples: 124503 download_size: 147966858 dataset_size: 261122999 --- # Dataset Card for "english_preference_hh_helpful_unfiltered" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
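Each row pairs a preferred (`chosen`) and dispreferred (`rejected`) transcript. Assuming the two strings share a prompt prefix and diverge at the final reply, as HH-style preference data typically does, the pair can be split apart with a minimal stdlib sketch (the record below is illustrative, not taken from the dataset):

```python
import os

def split_preference_pair(chosen: str, rejected: str):
    """Return (shared_prompt, chosen_reply, rejected_reply) by peeling off
    the longest common prefix of the two transcripts."""
    prefix = os.path.commonprefix([chosen, rejected])
    return prefix, chosen[len(prefix):], rejected[len(prefix):]

row = {  # hypothetical row in the chosen/rejected schema above
    "chosen": "\n\nHuman: Name a prime number.\n\nAssistant: 7 is prime.",
    "rejected": "\n\nHuman: Name a prime number.\n\nAssistant: 9 is prime.",
}
prompt, good, bad = split_preference_pair(row["chosen"], row["rejected"])
```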
looppayments/question_answering_token_classification_addendum
---
pretty_name: Question Answering Token Classification
---
Total train samples: 168397
Total test samples: 49233
Total tasks: 7

| Task | Train | Test |
| ---- | ----- | ---- |
| reference_number_association_without_question_boxes/2023-01-01 | 11481 | 3756 |
| reference_numbers/2023-01-01 | 12739 | 3974 |
| reference_number_association_with_question_boxes/2023-01-01 | 11481 | 3756 |
| table_cell_incremental_without_question_boxes/2023-01-01 | 22884 | 10566 |
| table_cell_incremental_with_question_boxes/2023-01-01 | 17986 | 6079 |
| table_header_with_question_boxes/2023-01-01 | 80278 | 17362 |
| key_value/2023-01-01 | 11548 | 3740 |

Total artifact_qids: 15860
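The per-task rows can be cross-checked against the stated totals — the seven train counts sum to 168397 and the seven test counts to 49233:

```python
# Per-task (train, test) counts, copied from the card's table.
task_counts = {
    "reference_number_association_without_question_boxes/2023-01-01": (11481, 3756),
    "reference_numbers/2023-01-01": (12739, 3974),
    "reference_number_association_with_question_boxes/2023-01-01": (11481, 3756),
    "table_cell_incremental_without_question_boxes/2023-01-01": (22884, 10566),
    "table_cell_incremental_with_question_boxes/2023-01-01": (17986, 6079),
    "table_header_with_question_boxes/2023-01-01": (80278, 17362),
    "key_value/2023-01-01": (11548, 3740),
}
train_total = sum(train for train, _ in task_counts.values())  # 168397
test_total = sum(test for _, test in task_counts.values())     # 49233
```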
chuyin0321/perimeter-sp500
--- dataset_info: features: - name: symbol dtype: string - name: security dtype: string - name: gics_sector dtype: string - name: gics_sub_industry dtype: string splits: - name: train num_bytes: 35417 num_examples: 503 download_size: 15479 dataset_size: 35417 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "perimeter-sp500" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pinkieseb/auslegalcorpus
--- license: apache-2.0 language: - en tags: - legal size_categories: - 1M<n<10M ---
LocalDoc/wikipedia_azerbaijan
--- language: - az license: cc-by-nc-nd-4.0 size_categories: - 100K<n<1M task_categories: - text-generation - fill-mask pretty_name: Azerbaijani Wikipedia Dataset tags: - wikipedia - azerbaijani - dataset - csv dataset_info: features: - name: '!' dtype: string - name: >- Nida işarəsi (!) — Aşağıdakı hallarda işlədilən durğu işarəsi: Nida cümləsinin sonunda. Məsələn: Azərbaycan dilində /Yanğın!/, /Fəlakət!/; əmr cümlələrində /Rədd ol burdan!/; Çağırış və müraciət həyəcanlı olanda. Məsələn: Azərbaycan dilində /Yaşasın müstəqil Azərbaycan!//; Nida cümlələrində özəksonu zəifləyir, zaman ləngiyir. /Ana! O, müqəddəs bir kainatdır//. dtype: string - name: https://az.wikipedia.org/w/index.php?curid=259941 dtype: string splits: - name: train num_bytes: 696233998 num_examples: 260011 download_size: 271389268 dataset_size: 696233998 configs: - config_name: default data_files: - split: train path: data/train-* --- <h2>Azerbaijani Wikipedia Dataset</h2> Description This dataset contains all articles from Wikipedia in Azerbaijani language. It was created in 2024 and contains 260k articles. Format The dataset is provided in comma-separated values (CSV) format. Each article is represented on a new line with the following fields separated by commas: title: Title of the article text: Text of the article url: URL of the article License The dataset is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license. This license allows you to freely share and redistribute the dataset with attribution to the source but prohibits commercial use and the creation of derivative works. Contact information If you have any questions or suggestions, please contact us at [v.resad.89@gmail.com].
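The card describes a headerless CSV layout (title, text, url per line) — note that the `dataset_info` above shows a real article absorbed as column names, so the field names should be supplied explicitly when parsing. A stdlib sketch over an in-memory sample with the same shape (the sample row is illustrative, not taken from the dataset):

```python
import csv
import io

FIELDS = ["title", "text", "url"]  # per the card's Format section

# Illustrative one-row sample in the described layout; a real use would
# open the downloaded CSV file instead.
sample = io.StringIO(
    'Bakı,"Bakı Azərbaycanın paytaxtıdır.",https://az.wikipedia.org/wiki/Bakı\n'
)
articles = [dict(zip(FIELDS, row)) for row in csv.reader(sample)]
```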
mariosasko/test_dataset
--- configs: - config_name: main data_files: "data.parquet" - config_name: dev data_files: "data.parquet" ---
Aishwarya30998/updated_alpaca_dataset_3k
--- language: - en dataset_info: features: - name: output dtype: string - name: input dtype: string - name: instruction dtype: string splits: - name: train num_bytes: 913431.7910849582 num_examples: 2500 - name: validation num_bytes: 91343.17910849582 num_examples: 250 - name: test num_bytes: 91343.17910849582 num_examples: 250 download_size: 692518 dataset_size: 1096118.14930195 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
relhousieny/tokenized_lamini_template
--- license: apache-2.0 dataset_info: features: - name: question dtype: string - name: answer dtype: string - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 2253501 num_examples: 1400 download_size: 692821 dataset_size: 2253501 configs: - config_name: default data_files: - split: train path: data/train-* ---
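The card lists parallel sequence columns (`input_ids`, `attention_mask`, `labels`). A quick sanity check one might run per row — the row below is illustrative, not taken from the dataset:

```python
def check_row(row: dict) -> None:
    """Assert the three sequence columns are aligned and the mask is binary."""
    n = len(row["input_ids"])
    assert len(row["attention_mask"]) == n == len(row["labels"])
    assert set(row["attention_mask"]) <= {0, 1}

row = {  # hypothetical tokenized example
    "input_ids": [101, 2054, 2003, 102],
    "attention_mask": [1, 1, 1, 1],
    "labels": [101, 2054, 2003, 102],
}
check_row(row)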
gagan3012/miracl-ar
--- dataset_info: features: - name: query dtype: string - name: positive sequence: string - name: negative sequence: string splits: - name: test num_bytes: 36656588 num_examples: 2896 download_size: 17999656 dataset_size: 36656588 configs: - config_name: default data_files: - split: test path: data/test-* ---
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/942ab115
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 178 num_examples: 10 download_size: 1314 dataset_size: 178 --- # Dataset Card for "942ab115" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
irds/nfcorpus_train_nontopic
--- pretty_name: '`nfcorpus/train/nontopic`' viewer: false source_datasets: ['irds/nfcorpus'] task_categories: - text-retrieval --- # Dataset Card for `nfcorpus/train/nontopic` The `nfcorpus/train/nontopic` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/nfcorpus#nfcorpus/train/nontopic). # Data This dataset provides: - `queries` (i.e., topics); count=1,141 - `qrels`: (relevance assessments); count=37,383 - For `docs`, use [`irds/nfcorpus`](https://huggingface.co/datasets/irds/nfcorpus) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/nfcorpus_train_nontopic', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/nfcorpus_train_nontopic', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @inproceedings{Boteva2016Nfcorpus, title="A Full-Text Learning to Rank Dataset for Medical Information Retrieval", author = "Vera Boteva and Demian Gholipour and Artem Sokolov and Stefan Riezler", booktitle = "Proceedings of the European Conference on Information Retrieval ({ECIR})", location = "Padova, Italy", publisher = "Springer", year = 2016 } ```
open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B
--- pretty_name: Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-30T00:15:01.452444](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B/blob/main/results_2023-12-30T00-15-01.452444.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6402511076573505,\n\ \ \"acc_stderr\": 0.03227558523986021,\n \"acc_norm\": 0.64104515726518,\n\ \ \"acc_norm_stderr\": 0.03293549849526372,\n \"mc1\": 0.31334149326805383,\n\ \ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.47161024077294855,\n\ \ \"mc2_stderr\": 0.014885155226330158\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735565,\n\ \ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6398127862975503,\n\ \ \"acc_stderr\": 0.0047907346837045865,\n \"acc_norm\": 0.8364867556263692,\n\ \ \"acc_norm_stderr\": 0.0036907745636380125\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\ \ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\ \ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\ \ \"acc_norm_stderr\": 0.03800968060554859\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\ \ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\ \ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\ \ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778415,\n \"\ acc_norm\": 0.41534391534391535,\n 
\"acc_norm_stderr\": 0.025379524910778415\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\ \ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\ \ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\ acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\ acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\ acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768756,\n\ \ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768756\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094753,\n\ \ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094753\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \ \ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\ acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\ acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\ acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \ \ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n\ \ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\ \ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\ \ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\ \ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\ \ \"acc_stderr\": 0.022209309073165623,\n \"acc_norm\": 0.8675213675213675,\n\ \ \"acc_norm_stderr\": 0.022209309073165623\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\ \ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\ \ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\ \ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\ \ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 
0.3027932960893855,\n\ \ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\ \ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\ \ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\ \ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \ \ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\ \ \"acc_stderr\": 0.012753716929101001,\n \"acc_norm\": 0.4745762711864407,\n\ \ \"acc_norm_stderr\": 0.012753716929101001\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\ \ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \ \ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\ \ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\ \ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\ \ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\ \ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\ \ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\ \ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\ \ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.47161024077294855,\n\ \ \"mc2_stderr\": 0.014885155226330158\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047444\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66565579984837,\n \ \ \"acc_stderr\": 0.012994634003332766\n }\n}\n```" repo_url: https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|arc:challenge|25_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-30T00-15-01.452444.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|gsm8k|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_30T00_15_01.452444 path: - 
'**/details_harness|hellaswag|10_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|truthfulqa:mc|0_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-30T00-15-01.452444.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_30T00_15_01.452444 path: - '**/details_harness|winogrande|5_2023-12-30T00-15-01.452444.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-30T00-15-01.452444.parquet' - config_name: results data_files: - split: 
2023_12_30T00_15_01.452444 path: - results_2023-12-30T00-15-01.452444.parquet - split: latest path: - results_2023-12-30T00-15-01.452444.parquet
---

# Dataset Card for Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-30T00:15:01.452444](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B/blob/main/results_2023-12-30T00-15-01.452444.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6402511076573505, "acc_stderr": 0.03227558523986021, "acc_norm": 0.64104515726518, "acc_norm_stderr": 0.03293549849526372, "mc1": 0.31334149326805383, "mc1_stderr": 0.016238065069059605, "mc2": 0.47161024077294855, "mc2_stderr": 0.014885155226330158 }, "harness|arc:challenge|25": { "acc": 0.6015358361774744, "acc_stderr": 0.014306946052735565, "acc_norm": 0.6348122866894198, "acc_norm_stderr": 0.0140702655192688 }, "harness|hellaswag|10": { "acc": 0.6398127862975503, "acc_stderr": 0.0047907346837045865, "acc_norm": 0.8364867556263692, "acc_norm_stderr": 0.0036907745636380125 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.03800968060554859, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.03800968060554859 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 
0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411018, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411018 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778415, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778415 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8064516129032258, "acc_stderr": 0.022475258525536057, "acc_norm": 0.8064516129032258, "acc_norm_stderr": 0.022475258525536057 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.02985751567338642, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.02985751567338642 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768756, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768756 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6358974358974359, "acc_stderr": 0.024396672985094753, "acc_norm": 0.6358974358974359, "acc_norm_stderr": 0.024396672985094753 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683512, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683512 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.01570349834846177, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.01570349834846177 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 
0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.028626547912437406, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.028626547912437406 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.031024411740572213, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.031024411740572213 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.038448761397852714, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.038448761397852714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.036756688322331886, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165623, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165623 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 
0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8122605363984674, "acc_stderr": 0.013964393769899133, "acc_norm": 0.8122605363984674, "acc_norm_stderr": 0.013964393769899133 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468348, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468348 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3027932960893855, "acc_stderr": 0.015366860386397112, "acc_norm": 0.3027932960893855, "acc_norm_stderr": 0.015366860386397112 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.025403832978179615, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.025403832978179615 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7098765432098766, "acc_stderr": 0.025251173936495036, "acc_norm": 0.7098765432098766, "acc_norm_stderr": 0.025251173936495036 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4745762711864407, "acc_stderr": 0.012753716929101001, "acc_norm": 0.4745762711864407, "acc_norm_stderr": 0.012753716929101001 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6580882352941176, "acc_stderr": 0.028814722422254184, "acc_norm": 0.6580882352941176, "acc_norm_stderr": 0.028814722422254184 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223977, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223977 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 
0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482706, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.02796678585916089, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.02796678585916089 }, "harness|truthfulqa:mc|0": { "mc1": 0.31334149326805383, "mc1_stderr": 0.016238065069059605, "mc2": 0.47161024077294855, "mc2_stderr": 0.014885155226330158 }, "harness|winogrande|5": { "acc": 0.797947908445146, "acc_stderr": 0.011285013754047444 }, "harness|gsm8k|5": { "acc": 0.66565579984837, "acc_stderr": 0.012994634003332766 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
waleko/unarXive-en2ru
--- license: - cc-by-sa-4.0 task_categories: - translation language: - en - ru size_categories: - 10K<n<100K source_datasets: - saier/unarXive_citrec annotations_creators: - machine-generated tags: - arXiv.org - arXiv - publication - paper - preprint - section - physics - mathematics - computer science - cs - machine translation - translation --- # Dataset Card for unarXive-en2ru This dataset contains text excerpts from the [unarXive citation recommendation](https://huggingface.co/datasets/saier/unarXive_citrec) dataset along with their translations into Russian. The translations have been obtained using [OpenAI GPT-3.5-Turbo](https://platform.openai.com/). The dataset is intended for machine translation research.
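A minimal sketch of how one parallel record from this corpus might be consumed in an MT pipeline. Note the card does not document the column schema, so the `"en"`/`"ru"` keys below are illustrative assumptions, not the confirmed field names.

```python
# Illustrative sketch only: the actual column names of
# waleko/unarXive-en2ru are not documented on this card, so the
# "en"/"ru" keys below are assumptions, not the confirmed schema.
sample = {
    "en": "We propose a new approach to citation recommendation.",
    "ru": "Мы предлагаем новый подход к рекомендации цитирований.",
}

def to_translation_pair(record, src_key="en", tgt_key="ru"):
    """Turn one record into a (source, target) pair for an MT pipeline."""
    return record[src_key], record[tgt_key]

src, tgt = to_translation_pair(sample)
print(src, "->", tgt)
```

In practice one would iterate records obtained via `datasets.load_dataset`, mapping each through a helper like `to_translation_pair` after checking the real column names.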
zapp926/psyEntryDataset
--- task_categories: - question-answering language: - zh size_categories: - 10K<n<100K ---
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_124
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1518648264.0 num_examples: 298242 download_size: 1550717650 dataset_size: 1518648264.0 --- # Dataset Card for "chunk_124" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Anthony-Tafoya/CommitPackSV
--- license: apache-2.0 ---
open-llm-leaderboard/details_nthngdy__pythia-owt2-70m-100k
--- pretty_name: Evaluation run of nthngdy/pythia-owt2-70m-100k dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [nthngdy/pythia-owt2-70m-100k](https://huggingface.co/nthngdy/pythia-owt2-70m-100k)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nthngdy__pythia-owt2-70m-100k\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-16T16:58:18.157087](https://huggingface.co/datasets/open-llm-leaderboard/details_nthngdy__pythia-owt2-70m-100k/blob/main/results_2023-09-16T16-58-18.157087.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01960989932885906,\n\ \ \"em_stderr\": 0.00141996222824606,\n \"f1\": 0.0546665268456376,\n\ \ \"f1_stderr\": 0.0018294405855806455,\n \"acc\": 0.26637726913970006,\n\ \ \"acc_stderr\": 0.007011150285217067\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.01960989932885906,\n \"em_stderr\": 0.00141996222824606,\n\ \ \"f1\": 0.0546665268456376,\n \"f1_stderr\": 0.0018294405855806455\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5327545382794001,\n\ \ \"acc_stderr\": 0.014022300570434134\n }\n}\n```" repo_url: https://huggingface.co/nthngdy/pythia-owt2-70m-100k leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|arc:challenge|25_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-07-19T13:34:55.847761.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_16T16_58_18.157087 path: - '**/details_harness|drop|3_2023-09-16T16-58-18.157087.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-16T16-58-18.157087.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_16T16_58_18.157087 path: - '**/details_harness|gsm8k|5_2023-09-16T16-58-18.157087.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-16T16-58-18.157087.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hellaswag|10_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:34:55.847761.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:34:55.847761.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:34:55.847761.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:34:55.847761.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:34:55.847761.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:34:55.847761.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:34:55.847761.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-management|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:34:55.847761.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_07_19T13_34_55.847761 path: - '**/details_harness|truthfulqa:mc|0_2023-07-19T13:34:55.847761.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-07-19T13:34:55.847761.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_16T16_58_18.157087 path: - '**/details_harness|winogrande|5_2023-09-16T16-58-18.157087.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-16T16-58-18.157087.parquet' - config_name: results data_files: - split: 2023_07_19T13_34_55.847761 path: - results_2023-07-19T13:34:55.847761.parquet - split: 2023_09_16T16_58_18.157087 path: - results_2023-09-16T16-58-18.157087.parquet - split: latest path: - results_2023-09-16T16-58-18.157087.parquet --- # Dataset Card for Evaluation run of nthngdy/pythia-owt2-70m-100k ## Dataset Description - 
**Homepage:**
- **Repository:** https://huggingface.co/nthngdy/pythia-owt2-70m-100k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [nthngdy/pythia-owt2-70m-100k](https://huggingface.co/nthngdy/pythia-owt2-70m-100k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_nthngdy__pythia-owt2-70m-100k",
    "harness_winogrande_5",
    split="train",
)
```

## Latest results

These are the [latest results from run 2023-09-16T16:58:18.157087](https://huggingface.co/datasets/open-llm-leaderboard/details_nthngdy__pythia-owt2-70m-100k/blob/main/results_2023-09-16T16-58-18.157087.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.01960989932885906,
        "em_stderr": 0.00141996222824606,
        "f1": 0.0546665268456376,
        "f1_stderr": 0.0018294405855806455,
        "acc": 0.26637726913970006,
        "acc_stderr": 0.007011150285217067
    },
    "harness|drop|3": {
        "em": 0.01960989932885906,
        "em_stderr": 0.00141996222824606,
        "f1": 0.0546665268456376,
        "f1_stderr": 0.0018294405855806455
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5327545382794001,
        "acc_stderr": 0.014022300570434134
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
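Once downloaded, the results JSON above is just a nested dictionary. As a minimal sketch (values copied from the block above rather than fetched live), you can fold it into a flat `task/metric` table for quick comparison across runs:

```python
# Minimal sketch: flatten the nested results JSON shown above into
# "task/metric" -> value pairs. Values are copied from the block above.
results = {
    "harness|drop|3": {"em": 0.01960989932885906, "f1": 0.0546665268456376},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5327545382794001},
}

flat = {
    f"{task}/{metric}": value
    for task, metrics in results.items()
    for metric, value in metrics.items()
}

print(flat["harness|winogrande|5/acc"])
```

The same flattening works on the full `results_*.json` files stored in this repository, since they follow the structure shown above.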
CyberHarem/nn_fireemblem
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of nn (Fire Emblem)

This is the dataset of nn (Fire Emblem), containing 92 images and their tags. The core tags of this character are `green_hair, pointy_ears, long_hair, braid, ahoge, purple_eyes, twin_braids, bangs`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------|:-----------|:------------|
| raw | 92 | 93.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 58.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 200 | 120.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 83.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 200 | 159.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nn_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/nn_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cape, solo, gloves, garter_straps, thighhighs, boots, smile |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, thigh_boots, thighhighs, frills, full_body, garter_straps, red_footwear, short_dress, zettai_ryouiki,
red_cape, simple_background, white_background, brown_gloves, open_mouth, white_dress, bag, arm_up, jewelry, long_sleeves, looking_at_viewer, ribbon, shiny_hair, smile | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | solo_focus, nipples, 1boy, 1girl, completely_nude, hetero, open_mouth, blush, pussy, smile, loli, navel, small_breasts, 2girls, heart, looking_at_viewer, penis, sex_from_behind | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | open_mouth, hair_flower, looking_at_viewer, rabbit_ears, 1girl, fake_animal_ears, official_alternate_costume, pink_gloves, solo, animal_hat, bunny_hat, dress, pantyhose, skirt, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | solo | gloves | garter_straps | thighhighs | boots | smile | thigh_boots | frills | full_body | red_footwear | short_dress | zettai_ryouiki | red_cape | simple_background | white_background | brown_gloves | open_mouth | white_dress | bag | arm_up | jewelry | long_sleeves | looking_at_viewer | ribbon | shiny_hair | solo_focus | nipples | 1boy | completely_nude | hetero | blush | pussy | loli | navel | small_breasts | 2girls | heart | penis | sex_from_behind | hair_flower | rabbit_ears | fake_animal_ears | official_alternate_costume | pink_gloves | animal_hat | bunny_hat | dress | pantyhose | skirt | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:---------|:----------------|:-------------|:--------|:--------|:--------------|:---------|:------------|:---------------|:--------------|:-----------------|:-----------|:--------------------|:-------------------|:---------------|:-------------|:--------------|:------|:---------|:----------|:---------------|:--------------------|:---------|:-------------|:-------------|:----------|:-------|:------------------|:---------|:--------|:--------|:-------|:--------|:----------------|:---------|:--------|:--------|:------------------|:--------------|:--------------|:-------------------|:-----------------------------|:--------------|:-------------|:------------|:--------|:------------|:--------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | | | | X | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | 
![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | | | | X | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
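The `IMG+TXT` packages above pair each image with its tags. Assuming each image sits next to a same-named `.txt` file carrying its comma-separated tags (an assumption about the archive layout, not confirmed by this card), a minimal sketch for collecting the tags after extraction:

```python
import os

# Assumed layout (hypothetical): each image in an IMG+TXT package has a
# same-named .txt file next to it carrying its comma-separated tags.
def load_tag_files(dataset_dir: str) -> dict:
    """Collect {basename: tag string} from the .txt files in a directory."""
    tags = {}
    for name in os.listdir(dataset_dir):
        if name.endswith('.txt'):
            base = os.path.splitext(name)[0]
            with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
                tags[base] = f.read().strip()
    return tags
```

This pairs naturally with the extraction step shown earlier: point it at the directory you unzipped a package into.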
ErfanMoosaviMonazzah/fake-news-detection-dataset-English
---
license: openrail
task_categories:
- text-classification
language:
- en
tags:
- fake news
- text classification
pretty_name: Fake News Detection Dataset (English)
size_categories:
- 10K<n<100K
---

This is a cleaned and split version of this dataset (https://www.kaggle.com/datasets/sadikaljarif/fake-news-detection-dataset-english) <br>
Labels:
- Fake News: 0
- Real News: 1
<br>
You can find the cleansing script at: https://github.com/ErfanMoosaviMonazzah/Fake-News-Detection
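Given the label scheme above, a minimal sketch for mapping the integer labels back to their names when inspecting examples or predictions (the helper name is ours, not part of the dataset):

```python
# Label scheme from the card: Fake News -> 0, Real News -> 1.
LABELS = {0: "Fake News", 1: "Real News"}

def label_name(label_id: int) -> str:
    """Map an integer class id to its human-readable name."""
    return LABELS[label_id]
```

For example, `label_name(1)` returns `"Real News"` for an example labeled `1`.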
davidho27941/steins_gate_1k_v1.1
---
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: response
    dtype: string
  - name: input_ids
    sequence:
      sequence: int32
  - name: attention_mask
    sequence:
      sequence: int8
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 2406799.8
    num_examples: 855
  - name: test
    num_bytes: 267422.2
    num_examples: 95
  download_size: 232436
  dataset_size: 2674222.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---
open-llm-leaderboard/details_KnutJaegersberg__platypus-1_8b
--- pretty_name: Evaluation run of KnutJaegersberg/platypus-1_8b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [KnutJaegersberg/platypus-1_8b](https://huggingface.co/KnutJaegersberg/platypus-1_8b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__platypus-1_8b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-04T23:54:13.264739](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__platypus-1_8b/blob/main/results_2024-01-04T23-54-13.264739.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3311382624677847,\n\ \ \"acc_stderr\": 0.03309340117681418,\n \"acc_norm\": 0.3354285679884653,\n\ \ \"acc_norm_stderr\": 0.03395064752932648,\n \"mc1\": 0.24724602203182375,\n\ \ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.407314806116824,\n\ \ \"mc2_stderr\": 0.01575648292147913\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.3174061433447099,\n \"acc_stderr\": 0.01360223908803817,\n\ \ \"acc_norm\": 0.33276450511945393,\n \"acc_norm_stderr\": 0.013769863046192309\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3979286994622585,\n\ \ \"acc_stderr\": 0.004884702412456094,\n \"acc_norm\": 0.5075682135032862,\n\ \ \"acc_norm_stderr\": 0.004989209770743239\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\ \ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n\ \ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n\ \ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.35094339622641507,\n \"acc_stderr\": 0.02937364625323469,\n\ \ \"acc_norm\": 0.35094339622641507,\n \"acc_norm_stderr\": 0.02937364625323469\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\ \ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\ \ \"acc_norm_stderr\": 0.03921067198982266\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\ \ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\ \ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\ \ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534432,\n\ \ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534432\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\ \ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\ \ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185554,\n\ \ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185554\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398203,\n \"\ acc_norm\": 0.29365079365079366,\n 
\"acc_norm_stderr\": 0.02345603738398203\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\ \ \"acc_stderr\": 0.03764950879790607,\n \"acc_norm\": 0.23015873015873015,\n\ \ \"acc_norm_stderr\": 0.03764950879790607\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\ \ \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.3419354838709677,\n\ \ \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\ \ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\ : 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.3515151515151515,\n \"acc_stderr\": 0.0372820699868265,\n\ \ \"acc_norm\": 0.3515151515151515,\n \"acc_norm_stderr\": 0.0372820699868265\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.398989898989899,\n \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\"\ : 0.398989898989899,\n \"acc_norm_stderr\": 0.03488901616852731\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.034801756684660366,\n\ \ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.034801756684660366\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\ \ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \ \ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n\ \ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\ acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3357798165137615,\n \"acc_stderr\": 0.020248081396752937,\n \"\ acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.020248081396752937\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012393,\n \"\ acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012393\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.35294117647058826,\n \"acc_stderr\": 0.033540924375915195,\n \"\ acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.033540924375915195\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533484,\n \ \ \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533484\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n\ \ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.3991031390134529,\n\ \ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n\ \ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"\ acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\ \ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\ \ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\ \ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\ \ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\ \ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258975,\n\ \ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258975\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5470085470085471,\n\ \ \"acc_stderr\": 0.03261099873098619,\n \"acc_norm\": 0.5470085470085471,\n\ \ \"acc_norm_stderr\": 0.03261099873098619\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4227330779054917,\n\ \ \"acc_stderr\": 0.017665180351954062,\n \"acc_norm\": 0.4227330779054917,\n\ \ \"acc_norm_stderr\": 0.017665180351954062\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.37283236994219654,\n \"acc_stderr\": 0.026033890613576277,\n\ \ \"acc_norm\": 0.37283236994219654,\n \"acc_norm_stderr\": 0.026033890613576277\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\ \ \"acc_stderr\": 0.014931316703220513,\n \"acc_norm\": 
0.2748603351955307,\n\ \ \"acc_norm_stderr\": 0.014931316703220513\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.027475969910660952,\n\ \ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.027475969910660952\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33440514469453375,\n\ \ \"acc_stderr\": 0.02679542232789394,\n \"acc_norm\": 0.33440514469453375,\n\ \ \"acc_norm_stderr\": 0.02679542232789394\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.38271604938271603,\n \"acc_stderr\": 0.027044538138402616,\n\ \ \"acc_norm\": 0.38271604938271603,\n \"acc_norm_stderr\": 0.027044538138402616\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n \ \ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n\ \ \"acc_stderr\": 0.012032022332260518,\n \"acc_norm\": 0.3324641460234681,\n\ \ \"acc_norm_stderr\": 0.012032022332260518\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.026799562024887678,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.026799562024887678\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.3545751633986928,\n \"acc_stderr\": 0.01935336054755369,\n \ \ \"acc_norm\": 0.3545751633986928,\n \"acc_norm_stderr\": 0.01935336054755369\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\ \ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.38181818181818183,\n\ \ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n\ \ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3582089552238806,\n\ \ \"acc_stderr\": 0.03390393042268815,\n \"acc_norm\": 0.3582089552238806,\n\ \ \"acc_norm_stderr\": 0.03390393042268815\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\ \ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\ \ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.3567251461988304,\n \"acc_stderr\": 0.03674013002860954,\n\ \ \"acc_norm\": 0.3567251461988304,\n \"acc_norm_stderr\": 0.03674013002860954\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\ \ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.407314806116824,\n\ \ \"mc2_stderr\": 0.01575648292147913\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5295974743488555,\n \"acc_stderr\": 0.014027843827840086\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \ \ \"acc_stderr\": 0.0018535550440036204\n }\n}\n```" repo_url: https://huggingface.co/KnutJaegersberg/platypus-1_8b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|arc:challenge|25_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-04T23-54-13.264739.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|gsm8k|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_04T23_54_13.264739 path: - 
'**/details_harness|hellaswag|10_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-54-13.264739.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-54-13.264739.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-54-13.264739.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-54-13.264739.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-54-13.264739.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-54-13.264739.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-54-13.264739.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-management|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T23-54-13.264739.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|truthfulqa:mc|0_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-04T23-54-13.264739.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_04T23_54_13.264739 path: - '**/details_harness|winogrande|5_2024-01-04T23-54-13.264739.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-04T23-54-13.264739.parquet' - config_name: results data_files: - split: 
2024_01_04T23_54_13.264739 path: - results_2024-01-04T23-54-13.264739.parquet - split: latest path: - results_2024-01-04T23-54-13.264739.parquet --- # Dataset Card for Evaluation run of KnutJaegersberg/platypus-1_8b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [KnutJaegersberg/platypus-1_8b](https://huggingface.co/KnutJaegersberg/platypus-1_8b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__platypus-1_8b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-04T23:54:13.264739](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__platypus-1_8b/blob/main/results_2024-01-04T23-54-13.264739.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3311382624677847, "acc_stderr": 0.03309340117681418, "acc_norm": 0.3354285679884653, "acc_norm_stderr": 0.03395064752932648, "mc1": 0.24724602203182375, "mc1_stderr": 0.015102404797359652, "mc2": 0.407314806116824, "mc2_stderr": 0.01575648292147913 }, "harness|arc:challenge|25": { "acc": 0.3174061433447099, "acc_stderr": 0.01360223908803817, "acc_norm": 0.33276450511945393, "acc_norm_stderr": 0.013769863046192309 }, "harness|hellaswag|10": { "acc": 0.3979286994622585, "acc_stderr": 0.004884702412456094, "acc_norm": 0.5075682135032862, "acc_norm_stderr": 0.004989209770743239 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.37037037037037035, "acc_stderr": 0.041716541613545426, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3355263157894737, "acc_stderr": 0.03842498559395268, "acc_norm": 0.3355263157894737, "acc_norm_stderr": 0.03842498559395268 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.35094339622641507, "acc_stderr": 0.02937364625323469, "acc_norm": 0.35094339622641507, "acc_norm_stderr": 0.02937364625323469 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3263888888888889, "acc_stderr": 0.03921067198982266, "acc_norm": 0.3263888888888889, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, 
"acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.30057803468208094, "acc_stderr": 0.0349610148119118, "acc_norm": 0.30057803468208094, "acc_norm_stderr": 0.0349610148119118 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.038739587141493524, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.038739587141493524 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3404255319148936, "acc_stderr": 0.030976692998534432, "acc_norm": 0.3404255319148936, "acc_norm_stderr": 0.030976692998534432 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0383515395439942, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0383515395439942 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.32413793103448274, "acc_stderr": 0.03900432069185554, "acc_norm": 0.32413793103448274, "acc_norm_stderr": 0.03900432069185554 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29365079365079366, "acc_stderr": 0.02345603738398203, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.02345603738398203 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23015873015873015, "acc_stderr": 0.03764950879790607, "acc_norm": 0.23015873015873015, "acc_norm_stderr": 0.03764950879790607 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3419354838709677, "acc_stderr": 0.026985289576552742, "acc_norm": 0.3419354838709677, "acc_norm_stderr": 0.026985289576552742 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2512315270935961, "acc_stderr": 0.030516530732694436, "acc_norm": 0.2512315270935961, "acc_norm_stderr": 0.030516530732694436 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.3515151515151515, "acc_stderr": 0.0372820699868265, "acc_norm": 0.3515151515151515, "acc_norm_stderr": 0.0372820699868265 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.398989898989899, "acc_stderr": 0.03488901616852731, "acc_norm": 0.398989898989899, "acc_norm_stderr": 0.03488901616852731 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.36787564766839376, "acc_stderr": 0.034801756684660366, "acc_norm": 0.36787564766839376, "acc_norm_stderr": 0.034801756684660366 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2692307692307692, "acc_stderr": 0.022489389793654824, "acc_norm": 0.2692307692307692, "acc_norm_stderr": 0.022489389793654824 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2518518518518518, "acc_stderr": 0.026466117538959912, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.026466117538959912 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.31512605042016806, "acc_stderr": 0.030176808288974337, "acc_norm": 0.31512605042016806, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.25165562913907286, "acc_stderr": 0.035433042343899844, "acc_norm": 0.25165562913907286, "acc_norm_stderr": 0.035433042343899844 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3357798165137615, "acc_stderr": 0.020248081396752937, "acc_norm": 0.3357798165137615, "acc_norm_stderr": 0.020248081396752937 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.26851851851851855, "acc_stderr": 
0.030225226160012393, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.030225226160012393 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.35294117647058826, "acc_stderr": 0.033540924375915195, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.033540924375915195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.4430379746835443, "acc_stderr": 0.03233532777533484, "acc_norm": 0.4430379746835443, "acc_norm_stderr": 0.03233532777533484 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3991031390134529, "acc_stderr": 0.03286745312567961, "acc_norm": 0.3991031390134529, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2748091603053435, "acc_stderr": 0.03915345408847834, "acc_norm": 0.2748091603053435, "acc_norm_stderr": 0.03915345408847834 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5041322314049587, "acc_stderr": 0.04564198767432754, "acc_norm": 0.5041322314049587, "acc_norm_stderr": 0.04564198767432754 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4166666666666667, "acc_stderr": 0.04766075165356461, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3803680981595092, "acc_stderr": 0.03814269893261837, "acc_norm": 0.3803680981595092, "acc_norm_stderr": 0.03814269893261837 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2767857142857143, "acc_stderr": 0.042466243366976256, "acc_norm": 0.2767857142857143, "acc_norm_stderr": 0.042466243366976256 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258975, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258975 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5470085470085471, "acc_stderr": 0.03261099873098619, "acc_norm": 0.5470085470085471, "acc_norm_stderr": 0.03261099873098619 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.32, "acc_stderr": 
0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4227330779054917, "acc_stderr": 0.017665180351954062, "acc_norm": 0.4227330779054917, "acc_norm_stderr": 0.017665180351954062 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.37283236994219654, "acc_stderr": 0.026033890613576277, "acc_norm": 0.37283236994219654, "acc_norm_stderr": 0.026033890613576277 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2748603351955307, "acc_stderr": 0.014931316703220513, "acc_norm": 0.2748603351955307, "acc_norm_stderr": 0.014931316703220513 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.35947712418300654, "acc_stderr": 0.027475969910660952, "acc_norm": 0.35947712418300654, "acc_norm_stderr": 0.027475969910660952 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.33440514469453375, "acc_stderr": 0.02679542232789394, "acc_norm": 0.33440514469453375, "acc_norm_stderr": 0.02679542232789394 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.38271604938271603, "acc_stderr": 0.027044538138402616, "acc_norm": 0.38271604938271603, "acc_norm_stderr": 0.027044538138402616 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.30141843971631205, "acc_stderr": 0.02737412888263115, "acc_norm": 0.30141843971631205, "acc_norm_stderr": 0.02737412888263115 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3324641460234681, "acc_stderr": 0.012032022332260518, "acc_norm": 0.3324641460234681, "acc_norm_stderr": 0.012032022332260518 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2647058823529412, "acc_stderr": 0.026799562024887678, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.026799562024887678 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3545751633986928, "acc_stderr": 0.01935336054755369, "acc_norm": 0.3545751633986928, "acc_norm_stderr": 0.01935336054755369 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.38181818181818183, 
"acc_stderr": 0.04653429807913508, "acc_norm": 0.38181818181818183, "acc_norm_stderr": 0.04653429807913508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.27755102040816326, "acc_stderr": 0.02866685779027465, "acc_norm": 0.27755102040816326, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3582089552238806, "acc_stderr": 0.03390393042268815, "acc_norm": 0.3582089552238806, "acc_norm_stderr": 0.03390393042268815 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-virology|5": { "acc": 0.3192771084337349, "acc_stderr": 0.0362933532994786, "acc_norm": 0.3192771084337349, "acc_norm_stderr": 0.0362933532994786 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3567251461988304, "acc_stderr": 0.03674013002860954, "acc_norm": 0.3567251461988304, "acc_norm_stderr": 0.03674013002860954 }, "harness|truthfulqa:mc|0": { "mc1": 0.24724602203182375, "mc1_stderr": 0.015102404797359652, "mc2": 0.407314806116824, "mc2_stderr": 0.01575648292147913 }, "harness|winogrande|5": { "acc": 0.5295974743488555, "acc_stderr": 0.014027843827840086 }, "harness|gsm8k|5": { "acc": 0.004548900682335102, "acc_stderr": 0.0018535550440036204 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
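The per-task metrics in the "Latest results" section above are plain JSON, so they can be inspected without re-running any evaluation. A minimal, self-contained sketch (using a small excerpt of the values shown above; the variable names are illustrative, not part of the leaderboard tooling) that picks out the task with the highest plain accuracy:

```python
import json

# A small excerpt of the "Latest results" JSON reported above.
results_excerpt = json.loads("""
{
  "harness|arc:challenge|25": {"acc": 0.3174061433447099, "acc_norm": 0.33276450511945393},
  "harness|hellaswag|10": {"acc": 0.3979286994622585, "acc_norm": 0.5075682135032862},
  "harness|winogrande|5": {"acc": 0.5295974743488555}
}
""")

# Collect the plain accuracy per task, skipping any entry without an "acc" key.
accs = {task: metrics["acc"] for task, metrics in results_excerpt.items() if "acc" in metrics}

# Task with the highest plain accuracy in this excerpt.
best_task = max(accs, key=accs.get)
print(best_task)  # harness|winogrande|5
```

The same pattern applies to the full results file; entries such as `harness|truthfulqa:mc|0` expose `mc1`/`mc2` rather than `acc`, which is why the comprehension filters on the key.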
open-llm-leaderboard/details_Tincando__fiction_story_generator
--- pretty_name: Evaluation run of Tincando/fiction_story_generator dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Tincando/fiction_story_generator](https://huggingface.co/Tincando/fiction_story_generator)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tincando__fiction_story_generator\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T08:16:23.951568](https://huggingface.co/datasets/open-llm-leaderboard/details_Tincando__fiction_story_generator/blob/main/results_2023-10-23T08-16-23.951568.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016778523489932886,\n\ \ \"em_stderr\": 0.001315352636324007,\n \"f1\": 0.04902579697986584,\n\ \ \"f1_stderr\": 0.0017542824329442046,\n \"acc\": 0.2505919494869771,\n\ \ \"acc_stderr\": 0.007026223145264506\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.016778523489932886,\n \"em_stderr\": 0.001315352636324007,\n\ \ \"f1\": 0.04902579697986584,\n \"f1_stderr\": 0.0017542824329442046\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n\ \ \"acc_stderr\": 0.014052446290529012\n }\n}\n```" repo_url: https://huggingface.co/Tincando/fiction_story_generator leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|arc:challenge|25_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-07-19T19:20:01.774519.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T08_16_23.951568 path: - '**/details_harness|drop|3_2023-10-23T08-16-23.951568.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T08-16-23.951568.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T08_16_23.951568 path: - '**/details_harness|gsm8k|5_2023-10-23T08-16-23.951568.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T08-16-23.951568.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hellaswag|10_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_07_19T19_20_01.774519 path: - '**/details_harness|truthfulqa:mc|0_2023-07-19T19:20:01.774519.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-07-19T19:20:01.774519.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T08_16_23.951568 path: - '**/details_harness|winogrande|5_2023-10-23T08-16-23.951568.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T08-16-23.951568.parquet' - config_name: results data_files: - split: 2023_07_19T19_20_01.774519 path: - results_2023-07-19T19:20:01.774519.parquet - split: 2023_10_23T08_16_23.951568 path: - results_2023-10-23T08-16-23.951568.parquet - split: latest path: - results_2023-10-23T08-16-23.951568.parquet --- # Dataset Card for Evaluation run of Tincando/fiction_story_generator ## Dataset Description - 
**Homepage:**
- **Repository:** https://huggingface.co/Tincando/fiction_story_generator
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Tincando/fiction_story_generator](https://huggingface.co/Tincando/fiction_story_generator) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Tincando__fiction_story_generator",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T08:16:23.951568](https://huggingface.co/datasets/open-llm-leaderboard/details_Tincando__fiction_story_generator/blob/main/results_2023-10-23T08-16-23.951568.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.016778523489932886,
        "em_stderr": 0.001315352636324007,
        "f1": 0.04902579697986584,
        "f1_stderr": 0.0017542824329442046,
        "acc": 0.2505919494869771,
        "acc_stderr": 0.007026223145264506
    },
    "harness|drop|3": {
        "em": 0.016778523489932886,
        "em_stderr": 0.001315352636324007,
        "f1": 0.04902579697986584,
        "f1_stderr": 0.0017542824329442046
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5011838989739542,
        "acc_stderr": 0.014052446290529012
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
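For quick inspection, the per-task entries in the results JSON can be read with the standard library once the results file is downloaded; the following is a minimal sketch, not part of the official tooling (the JSON literal below is abbreviated to the two entries the sketch reads):

```python
import json

# Abbreviated copy of the "Latest results" JSON,
# keeping only the fields this sketch reads.
results_text = """
{
    "all": {"acc": 0.2505919494869771, "acc_stderr": 0.007026223145264506},
    "harness|winogrande|5": {"acc": 0.5011838989739542, "acc_stderr": 0.014052446290529012}
}
"""
results = json.loads(results_text)
winogrande = results["harness|winogrande|5"]
print(f"winogrande acc = {winogrande['acc']:.4f} +/- {winogrande['acc_stderr']:.4f}")
```

The same keys (`"all"` plus one `"harness|..."` entry per task) apply to the full results file.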
VIMA/VIMA-Data
---
license: cc-by-4.0
---

# Dataset Card for VIMA-Data

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)

## Dataset Description

- **Homepage:** https://vimalabs.github.io/
- **Repository:** https://github.com/vimalabs/VimaBench
- **Paper:** https://arxiv.org/abs/2210.03094

### Dataset Summary

This is the official dataset used to train general robot manipulation agents with multimodal prompts, as presented in the [paper](https://arxiv.org/abs/2210.03094). It contains 650K trajectories for 13 tasks in [VIMA-Bench](https://github.com/vimalabs/VimaBench). All demonstrations are generated by oracles.

## Dataset Structure

Data are grouped into different tasks. Within each trajectory's folder, there are two folders, `rgb_front` and `rgb_top`, and three files, `obs.pkl`, `action.pkl`, and `trajectory.pkl`. RGB frames from each perspective are stored in the corresponding folder. `obs.pkl` includes segmentation and the state of the end effector. `action.pkl` contains oracle actions. `trajectory.pkl` contains meta information such as elapsed steps, task information, and object information. Users can build their custom data pipeline starting from here. More details and examples can be found [here](https://github.com/vimalabs/VimaBench#training-data).

## Dataset Creation

All demonstrations are generated by scripted oracles.

## Additional Information

### Licensing Information

This dataset is released under the [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/legalcode) license.

### Citation Information

If you find our work useful, please consider citing us!
```bibtex
@inproceedings{jiang2023vima,
  title     = {VIMA: General Robot Manipulation with Multimodal Prompts},
  author    = {Yunfan Jiang and Agrim Gupta and Zichen Zhang and Guanzhi Wang and Yongqiang Dou and Yanjun Chen and Li Fei-Fei and Anima Anandkumar and Yuke Zhu and Linxi Fan},
  booktitle = {Fortieth International Conference on Machine Learning},
  year      = {2023}
}
```
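The per-trajectory layout described under "Dataset Structure" (two frame folders plus three pickle files) can serve as a starting point for a custom pipeline. A minimal sketch; the `load_trajectory` helper name is ours, not part of the official VIMA-Bench tooling:

```python
import os
import pickle

def load_trajectory(traj_dir):
    """Unpickle obs.pkl, action.pkl, and trajectory.pkl from one trajectory folder.

    Assumes `traj_dir` follows the layout described above; the RGB frames live in
    traj_dir/rgb_front and traj_dir/rgb_top and can be read with any image library.
    """
    data = {}
    for name in ("obs", "action", "trajectory"):
        with open(os.path.join(traj_dir, name + ".pkl"), "rb") as f:
            data[name] = pickle.load(f)
    return data
```

The exact keys inside each pickle depend on the task; inspect one trajectory first before writing batch-loading code.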
anan-2024/twitter_dataset_1713229323
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 80808 num_examples: 213 download_size: 49179 dataset_size: 80808 configs: - config_name: default data_files: - split: train path: data/train-* ---
warshakhan/DocVQA_spatial_HMP_data
--- dataset_info: features: - name: id dtype: string - name: image dtype: image - name: questions sequence: string - name: answers sequence: string - name: bbox sequence: sequence: int64 splits: - name: train num_bytes: 943093288.492 num_examples: 5598 - name: test num_bytes: 168846085.0 num_examples: 799 - name: validation num_bytes: 314036438.549 num_examples: 1601 download_size: 1610549527 dataset_size: 1425975812.0410001 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* ---
yoshitomo-matsubara/srsd-feynman_hard_dummy
---
pretty_name: SRSD-Feynman (Hard w/ Dummy Variables)
annotations_creators:
- expert
language_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended
task_categories:
- tabular-regression
task_ids: []
---

# Dataset Card for SRSD-Feynman (Hard set with Dummy Variables)

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)

## Dataset Description

- **Homepage:**
- **Repository:** https://github.com/omron-sinicx/srsd-benchmark
- **Paper:** [Rethinking Symbolic Regression Datasets and Benchmarks for Scientific Discovery](https://arxiv.org/abs/2206.10540)
- **Point of Contact:** [Yoshitaka Ushiku](mailto:yoshitaka.ushiku@sinicx.com)

### Dataset Summary

Our SRSD (Feynman) datasets are designed to discuss the performance of Symbolic Regression for Scientific Discovery.
We carefully reviewed the properties of each formula and its variables in [the Feynman Symbolic Regression Database](https://space.mit.edu/home/tegmark/aifeynman.html) to design reasonably realistic sampling ranges of values so that our SRSD datasets can be used for evaluating the potential of SRSD, such as whether or not an SR method can (re)discover physical laws from such datasets.

This is the ***Hard set with dummy variables*** of our SRSD-Feynman datasets, which consists of the following 50 different physics formulas:

[![Click here to open a PDF file](problem_table.png)](https://huggingface.co/datasets/yoshitomo-matsubara/srsd-feynman_hard_dummy/resolve/main/problem_table.pdf)

Dummy variables were randomly generated, and symbolic regression models should not use the dummy variables as part of their predictions. The following datasets contain

**1 dummy variable**: I.15.3x, I.30.3, II.6.15a, II.11.17, II.11.28, II.13.23, II.13.34, II.24.17, B1, B6, B12, B16, B17

**2 dummy variables**: I.6.20, I.6.20b, I.9.18, I.15.3t, I.29.16, I.34.14, I.39.22, I.44.4, II.11.20, II.11.27, II.35.18, III.9.52, III.10.19, III.21.20, B2, B3, B7, B9

**3 dummy variables**: I.6.20a, I.32.17, I.37.4, I.40.1, I.41.16, I.50.26, II.6.15b, II.35.21, II.36.38, III.4.33, B4, B5, B10, B11, B13, B14, B15, B19, B20

More details of these datasets are provided in [the paper and its supplementary material](https://openreview.net/forum?id=qrUdrXsiXX).

### Supported Tasks and Leaderboards

Symbolic Regression

## Dataset Structure

### Data Instances

Tabular data + ground-truth equation per equation

- Tabular data: (num_samples, num_variables+1), where the last (rightmost) column indicates the output of the target function for the given variables. Note that the number of variables (`num_variables`) varies from equation to equation.
- Ground-truth equation: *pickled* symbolic representation (equation with symbols in sympy) of the target function.

### Data Fields

For each dataset, we have
1. train split (txt file, whitespace as a delimiter)
2. val split (txt file, whitespace as a delimiter)
3. test split (txt file, whitespace as a delimiter)
4. true equation (pickle file for sympy object)

### Data Splits

- train: 8,000 samples per equation
- val: 1,000 samples per equation
- test: 1,000 samples per equation

## Dataset Creation

### Curation Rationale

We chose target equations based on [the Feynman Symbolic Regression Database](https://space.mit.edu/home/tegmark/aifeynman.html).

### Annotations

#### Annotation process

We significantly revised the sampling range for each variable from the annotations in the Feynman Symbolic Regression Database. First, we checked the properties of each variable and treated physical constants (e.g., light speed, gravitational constant) as constants. Next, variable ranges were defined to correspond to each typical physics experiment to confirm the physical phenomenon for each equation. In cases where a specific experiment is difficult to assume, ranges were set within which the corresponding physical phenomenon can be seen. Generally, the ranges are set to be sampled on log scales spanning about two orders of magnitude (10^2), in order to capture both large and small changes in value as the order changes. Variables such as angles, for which a linear distribution is expected, are set to be sampled uniformly. In addition, variables that take a specific sign were set to be sampled within that range.

#### Who are the annotators?

The main annotators are
- Naoya Chiba (@nchiba)
- Ryo Igarashi (@rigarash)

### Personal and Sensitive Information

N/A

## Considerations for Using the Data

### Social Impact of Dataset

We annotated this dataset assuming typical physical experiments. The dataset will engage research on symbolic regression for scientific discovery (SRSD) and help researchers discuss the potential of symbolic regression methods towards data-driven scientific discovery.
### Discussion of Biases

Our choices of target equations are based on [the Feynman Symbolic Regression Database](https://space.mit.edu/home/tegmark/aifeynman.html), which is focused on the field of physics.

### Other Known Limitations

Some variables used in our datasets indicate counts and should in principle be treated as integers. Due to the capacity of 32-bit integers, however, we treated some of these variables as floats, e.g., the number of molecules (10^{23} - 10^{25}).

## Additional Information

### Dataset Curators

The main curators are
- Naoya Chiba (@nchiba)
- Ryo Igarashi (@rigarash)

### Licensing Information

Creative Commons Attribution 4.0

### Citation Information

[[OpenReview](https://openreview.net/forum?id=qrUdrXsiXX)] [[Video](https://www.youtube.com/watch?v=MmeOXuUUAW0)] [[Preprint](https://arxiv.org/abs/2206.10540)]

```bibtex
@article{matsubara2024rethinking,
  title={Rethinking Symbolic Regression Datasets and Benchmarks for Scientific Discovery},
  author={Matsubara, Yoshitomo and Chiba, Naoya and Igarashi, Ryo and Ushiku, Yoshitaka},
  journal={Journal of Data-centric Machine Learning Research},
  year={2024},
  url={https://openreview.net/forum?id=qrUdrXsiXX}
}
```

### Contributions

Authors:
- Yoshitomo Matsubara (@yoshitomo-matsubara)
- Naoya Chiba (@nchiba)
- Ryo Igarashi (@rigarash)
- Yoshitaka Ushiku (@yushiku)
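As a closing usage note: the split files described under "Data Fields" are plain whitespace-delimited text with the target value in the last (rightmost) column, and the true equation is a pickled sympy expression. A minimal standard-library sketch; the helper names `load_split` and `load_true_equation` are ours, not part of the benchmark code:

```python
import pickle

def load_split(path):
    """Parse one SRSD split file: one sample per row, whitespace-delimited,
    with the last (rightmost) column holding the target function's output."""
    X, y = [], []
    with open(path) as f:
        for line in f:
            values = [float(v) for v in line.split()]
            X.append(values[:-1])
            y.append(values[-1])
    return X, y

def load_true_equation(path):
    """Unpickle the ground-truth sympy expression for an equation."""
    with open(path, "rb") as f:
        return pickle.load(f)
```

With numpy installed, `numpy.loadtxt(path)` achieves the same split parsing in one call.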
alimrb/digikala_faqs
--- dataset_info: features: - name: answer dtype: string splits: - name: train num_bytes: 1893 num_examples: 5 download_size: 3800 dataset_size: 1893 configs: - config_name: default data_files: - split: train path: data/train-* ---
mokshaannigeri/CS482_Housing_Dataset
--- dataset_info: features: - name: longitude dtype: float64 - name: latitude dtype: float64 - name: housing_median_age dtype: float64 - name: total_rooms dtype: float64 - name: total_bedrooms dtype: float64 - name: population dtype: float64 - name: households dtype: float64 - name: median_income dtype: float64 - name: median_house_value dtype: float64 - name: ocean_proximity dtype: string splits: - name: train num_bytes: 1737680 num_examples: 20640 download_size: 824144 dataset_size: 1737680 configs: - config_name: default data_files: - split: train path: data/train-* language: - en pretty_name: b ---
takaaki-inada/databricks-dolly-15k-ja-zundamon
---
license: cc-by-sa-3.0
---

This dataset was based on "kunishou/databricks-dolly-15k-ja".
This dataset is licensed under CC BY SA 3.0.

Last Update: 2023-05-11

databricks-dolly-15k-ja
https://github.com/kunishou/databricks-dolly-15k-ja

databricks-dolly-15k
https://github.com/databrickslabs/dolly/tree/master/data
ernestomccormick/pdv_generated_qa_final_users_1
--- license: mit ---
Hikam/HospitalityReviews
--- license: apache-2.0 ---
qgiaohc/twitter_dataset_1713195798
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 30358 num_examples: 71 download_size: 17932 dataset_size: 30358 configs: - config_name: default data_files: - split: train path: data/train-* ---
Hack90/ncbi_genbank_part_27
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: id dtype: string - name: sequence dtype: string - name: name dtype: string - name: description dtype: string - name: features dtype: int64 - name: seq_length dtype: int64 splits: - name: train num_bytes: 32872955172 num_examples: 183931 download_size: 14877735882 dataset_size: 32872955172 --- # Dataset Card for "ncbi_genbank_part_27" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
W1lson/Book4
--- dataset_info: features: - name: Source ID dtype: int64 - name: Primary Text dtype: string splits: - name: train num_bytes: 9831 num_examples: 87 download_size: 0 dataset_size: 9831 --- # Dataset Card for "Book4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
thomaslu/articulationGAN_finetuning_data
--- configs: - config_name: default data_files: - split: test path: data/test-* - split: train path: data/train-* dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 - name: sentence dtype: string splits: - name: test num_bytes: 9100963.0 num_examples: 111 - name: train num_bytes: 32796309.0 num_examples: 400 download_size: 41933143 dataset_size: 41897272.0 --- # Dataset Card for "articulationGAN_finetuning_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
SakisAth/workshop_bloom
--- dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 38390 num_examples: 150 download_size: 6482 dataset_size: 38390 --- # Dataset Card for "workshop_bloom" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adalib/monkey-cond-gen
--- dataset_info: features: - name: code dtype: string splits: - name: train num_bytes: 1727649572 num_examples: 165193 download_size: 621840619 dataset_size: 1727649572 configs: - config_name: default data_files: - split: train path: data/train-* ---
pravsels/AnimationsWithManim_Elteoremadebeethoven_code
--- dataset_info: features: - name: file_path dtype: string - name: content dtype: string splits: - name: train num_bytes: 602446 num_examples: 76 download_size: 174860 dataset_size: 602446 configs: - config_name: default data_files: - split: train path: data/train-* ---
coyotespike/mydataset
--- license: mit ---
DoubleML/multimodal_confounding
--- license: bsd-3-clause dataset_info: features: - name: cond_exp_y dtype: float64 - name: m1 dtype: float64 - name: g1 dtype: float64 - name: l1 dtype: float64 - name: Y dtype: float64 - name: D_1 dtype: float64 - name: carat dtype: float64 - name: depth dtype: float64 - name: table dtype: float64 - name: price dtype: float64 - name: x dtype: float64 - name: y dtype: float64 - name: z dtype: float64 - name: review dtype: string - name: sentiment dtype: string - name: label dtype: int64 - name: cut_Good dtype: bool - name: cut_Ideal dtype: bool - name: cut_Premium dtype: bool - name: cut_Very Good dtype: bool - name: color_E dtype: bool - name: color_F dtype: bool - name: color_G dtype: bool - name: color_H dtype: bool - name: color_I dtype: bool - name: color_J dtype: bool - name: clarity_IF dtype: bool - name: clarity_SI1 dtype: bool - name: clarity_SI2 dtype: bool - name: clarity_VS1 dtype: bool - name: clarity_VS2 dtype: bool - name: clarity_VVS1 dtype: bool - name: clarity_VVS2 dtype: bool - name: image dtype: image splits: - name: train num_bytes: 185209908.0 num_examples: 50000 download_size: 174280492 dataset_size: 185209908.0 tags: - Causal Inference size_categories: - 10K<n<100K --- # Dataset Card Semi-synthetic dataset with multimodal confounding. The dataset is generated according to the description in [DoubleMLDeep: Estimation of Causal Effects with Multimodal Data](https://arxiv.org/abs/2402.01785). ## Dataset Details ### Dataset Description & Usage The dataset is a semi-synthetic benchmark for treatment effect estimation with multimodal confounding. The outcome variable `Y` is generated according to a partially linear model $$ Y = \theta_0 D_1 + g_1(X) + \varepsilon $$ with a constant treatment effect of $$\theta_0=0.5.$$ The target variables `sentiment`, `label` and `price` are used to generate credible confounding by affecting both `Y` and `D_1`.
This confounding is generated to be negative, such that estimates of the treatment effect should generally be smaller than `0.5`. For a more detailed description of the data generating process, see [DoubleMLDeep: Estimation of Causal Effects with Multimodal Data](https://arxiv.org/abs/2402.01785). The dataset includes the corresponding target variables `sentiment`, `label`, `price` and oracle values such as `cond_exp_y`, `l1`, `m1`, `g1`. These values are included for convenience, e.g. for benchmarking against optimal estimates, but should not be used in the model. Further, several tabular features are highly correlated, such that it may be helpful to drop features such as `x`, `y`, `z`. An example looks as follows: ``` {'cond_exp_y': 2.367230022801451, 'm1': -2.7978920933712907, 'g1': 4.015536418538365, 'l1': 2.61659037185272, 'Y': 3.091541535115522, 'D_1': -3.2966127914738275, 'carat': 0.5247285289349821, 'depth': 58.7, 'table': 59.0, 'price': 9.7161333532141, 'x': 7.87, 'y': 7.78, 'z': 4.59, 'review': "I really liked this Summerslam due to the look of the arena, the curtains and just the look overall was interesting to me for some reason. Anyways, this could have been one of the best Summerslam's ever if the WWF didn't have Lex Luger in the main event against Yokozuna, now for it's time it was ok to have a huge fat man vs a strong man but I'm glad times have changed. It was a terrible main event just like every match Luger is in is terrible.
Other matches on the card were Razor Ramon vs Ted Dibiase, Steiner Brothers vs Heavenly Bodies, Shawn Michaels vs Curt Hening, this was the event where Shawn named his big monster of a body guard Diesel, IRS vs 1-2-3 Kid, Bret Hart first takes on Doink then takes on Jerry Lawler and stuff with the Harts and Lawler was always very interesting, then Ludvig Borga destroyed Marty Jannetty, Undertaker took on Giant Gonzalez in another terrible match, The Smoking Gunns and Tatanka took on Bam Bam Bigelow and the Headshrinkers, and Yokozuna defended the world title against Lex Luger this match was boring and it has a terrible ending. However it deserves 8/10", 'sentiment': 'positive', 'label': 6, 'cut_Good': False, 'cut_Ideal': False, 'cut_Premium': True, 'cut_Very Good': False, 'color_E': False, 'color_F': True, 'color_G': False, 'color_H': False, 'color_I': False, 'color_J': False, 'clarity_IF': False, 'clarity_SI1': False, 'clarity_SI2': False, 'clarity_VS1': False, 'clarity_VS2': True, 'clarity_VVS1': False, 'clarity_VVS2': False, 'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=32x32>} ``` ### Dataset Sources The dataset is based on three commonly used datasets: - [Diamonds dataset](https://www.kaggle.com/datasets/shivam2503/diamonds) - [IMDB dataset](https://huggingface.co/datasets/imdb) - [CIFAR-10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html) The versions used to create this dataset can be found on Kaggle: - [Diamonds dataset (Kaggle)](https://www.kaggle.com/datasets/shivam2503/diamonds) - [IMDB dataset (Kaggle)](https://www.kaggle.com/datasets/lakshmi25npathi/imdb-dataset-of-50k-movie-reviews?select=IMDB+Dataset.csv) - [CIFAR-10 dataset (Kaggle)](https://www.kaggle.com/datasets/swaroopkml/cifar10-pngs-in-folders) The original citations can be found below. ### Dataset Preprocessing All datasets are subsampled to be of equal size (`50,000`).
The CIFAR-10 data is based on the training dataset, whereas the IMDB data combines train and test data to obtain `50,000` observations. The labels of the CIFAR-10 data are set to integer values `0` to `9`. The Diamonds dataset is cleaned (values with `x`, `y`, `z` equal to `0` are removed) and outliers are dropped (such that `45<depth<75`, `40<table<80`, `x<30`, `y<30` and `2<z<30`). The remaining `53,907` observations are downsampled to the same size of `50,000` observations. Further, `price` and `carat` are transformed with the natural logarithm and `cut`, `color` and `clarity` are dummy coded (with baselines `Fair`, `D` and `I1`). ## Uses The dataset should serve as a benchmark to compare different causal inference methods for observational data under multimodal confounding. ## Dataset Structure ### Data Instances ### Data Fields The data fields can be divided into several categories: - **Outcome and Treatments** - `Y` (`float64`): Outcome of interest - `D_1` (`float64`): Treatment value - **Text Features** - `review` (`string`): IMDB review text - `sentiment` (`string`): Corresponding sentiment, either `positive` or `negative` - **Image Features** - `image` (`image`): Image - `label` (`int64`): Corresponding label from `0` to `9` - **Tabular Features** - `price` (`float64`): Logarithm of the price in US dollars - `carat` (`float64`): Logarithm of the weight of the diamond - `x` (`float64`): Length in mm - `y` (`float64`): Width in mm - `z` (`float64`): Depth in mm - `depth` (`float64`): Total depth percentage - `table` (`float64`): Width of top of diamond relative to widest point - **Cut**: Quality of the cut (`Fair`, `Good`, `Very Good`, `Premium`, `Ideal`) (dummy coded with `Fair` as baseline) - `cut_Good` (`bool`) - `cut_Very Good` (`bool`) - `cut_Premium` (`bool`) - `cut_Ideal` (`bool`) - **Color**: Diamond color, from `J` (worst) to `D` (best) (dummy coded with `D` as baseline) - `color_E` (`bool`) - `color_F` (`bool`) - `color_G` (`bool`) - `color_H` (`bool`) - 
`color_I` (`bool`) - `color_J` (`bool`) - **Clarity**: Measurement of diamond clarity (`I1` (worst), `SI2`, `SI1`, `VS2`, `VS1`, `VVS2`, `VVS1`, `IF` (best)) (dummy coded with `I1` as baseline) - `clarity_SI2` (`bool`) - `clarity_SI1` (`bool`) - `clarity_VS2` (`bool`) - `clarity_VS1` (`bool`) - `clarity_VVS2` (`bool`) - `clarity_VVS1` (`bool`) - `clarity_IF` (`bool`) - **Oracle Features** - `cond_exp_y` (`float64`): Expected value of `Y` conditional on `D_1`, `sentiment`, `label` and `price` - `l1` (`float64`): Expected value of `Y` conditional on `sentiment`, `label` and `price` - `m1` (`float64`): Expected value of `D_1` conditional on `sentiment`, `label` and `price` - `g1` (`float64`): Additive component of `Y` based on `sentiment`, `label` and `price` (see Dataset Description) ## Limitations As the confounding is generated via original labels, completely removing the confounding might not be possible. ## Citation Information ### Dataset Citation If you use the dataset please cite this article: ``` @article{klaassen2024doublemldeep, title={DoubleMLDeep: Estimation of Causal Effects with Multimodal Data}, author={Klaassen, Sven and Teichert-Kluge, Jan and Bach, Philipp and Chernozhukov, Victor and Spindler, Martin and Vijaykumar, Suhas}, journal={arXiv preprint arXiv:2402.01785}, year={2024} } ``` ### Dataset Sources The three original datasets can be cited via Diamonds dataset: ``` @Book{ggplot2_book, author = {Hadley Wickham}, title = {ggplot2: Elegant Graphics for Data Analysis}, publisher = {Springer-Verlag New York}, year = {2016}, isbn = {978-3-319-24277-4}, url = {https://ggplot2.tidyverse.org}, } ``` IMDB dataset: ``` @InProceedings{maas-EtAl:2011:ACL-HLT2011, author = {Maas, Andrew L. and Daly, Raymond E. and Pham, Peter T. and Huang, Dan and Ng, Andrew Y. 
and Potts, Christopher}, title = {Learning Word Vectors for Sentiment Analysis}, booktitle = {Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies}, month = {June}, year = {2011}, address = {Portland, Oregon, USA}, publisher = {Association for Computational Linguistics}, pages = {142--150}, url = {http://www.aclweb.org/anthology/P11-1015} } ``` CIFAR-10 dataset: ``` @TECHREPORT{Krizhevsky09learningmultiple, author = {Alex Krizhevsky}, title = {Learning multiple layers of features from tiny images}, institution = {}, year = {2009} } ``` ## Dataset Card Authors Sven Klaassen
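The partially linear model above can be illustrated with a small sketch of a partialling-out estimator (a minimal illustration on synthetic data, assuming only `numpy`; the helper name `partialling_out` and the toy data generating process are not part of the dataset or the paper):

```python
import numpy as np

def partialling_out(y, d, l_hat, m_hat):
    """Final-stage estimate of theta in Y = theta*D + g(X) + eps,
    given (estimates of) the nuisances l(X) = E[Y|X] and m(X) = E[D|X]."""
    res_y = y - l_hat   # residualised outcome
    res_d = d - m_hat   # residualised treatment
    return float(np.sum(res_d * res_y) / np.sum(res_d ** 2))

# Tiny synthetic check with a known effect of 0.5, mirroring theta_0 in the card.
rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)                        # confounder
d = -x + rng.normal(size=n)                   # treatment, m(x) = -x
y = 0.5 * d + 2 * x + rng.normal(size=n)      # outcome, g(x) = 2x

# Naive OLS of y on d is pulled away from 0.5 by the confounding through x.
naive = float(np.sum(d * y) / np.sum(d ** 2))

# With the true nuisances (l(x) = 0.5 * (-x) + 2x = 1.5x) the estimate recovers 0.5.
theta = partialling_out(y, d, l_hat=1.5 * x, m_hat=-x)
print(naive, theta)
```

On the dataset itself, the oracle columns `l1` and `m1` play the role of the true nuisances, so an analogous call on the columns `Y`, `D_1`, `l1`, `m1` should recover an estimate close to `0.5`, while naive estimates are pulled below it by the negative confounding.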
gcaillaut/frwiki_good_pages_el
--- annotations_creators: - machine-generated language_creators: [] language: - fr license: - wtfpl multilinguality: - monolingual pretty_name: test size_categories: - unknown source_datasets: - original task_categories: - other task_ids: [] --- # Dataset Card for frwiki_good_pages_el ## Dataset Description - Repository: [frwiki_good_pages_el](https://github.com/GaaH/frwiki_good_pages_el) - Point of Contact: [Gaëtan Caillaut](mailto://g.caillaut@brgm.fr) ### Dataset Summary This dataset contains _featured_ and _good_ articles from the French Wikipédia. Pages are downloaded, as HTML files, from the [French Wikipedia website](https://fr.wikipedia.org). It is intended to be used to train Entity Linking (EL) systems. Links in articles are used to detect named entities. ### Languages - French ## Dataset Structure ``` { "title": "Title of the page", "qid": "QID of the corresponding Wikidata entity", "words": ["tokens"], "wikipedia": ["Wikipedia description of each entity"], "wikidata": ["Wikidata description of each entity"], "labels": ["NER labels"], "titles": ["Wikipedia title of each entity"], "qids": ["QID of each entity"], } ``` The `words` field contains the article’s text split on whitespace. The other fields are lists with the same length as `words` and contain data only when the respective token in `words` is the __start of an entity__. For instance, if the _i-th_ token in `words` is an entity, then the _i-th_ element of `wikipedia` contains a description, extracted from Wikipedia, of this entity. The same applies to the other fields. If the entity spans multiple words, then only the entry at the index of the first word contains data. The only exception is the `labels` field, which is used to delimit entities. It uses the IOB encoding: if the token is not part of an entity, the label is `"O"`; if it is the first word of a multi-word entity, the label is `"B"`; otherwise the label is `"I"`.
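The IOB scheme described above can be decoded with a few lines of Python (a minimal sketch; the function name and the example tokens are hypothetical, and note that per the card's convention `"B"` marks only the first word of multi-word entities, so a single-word entity may carry `"I"` directly):

```python
def extract_entities(words, labels):
    """Group tokens into entity mentions using the IOB labels described above:
    "B" starts a multi-word entity, "I" continues (or starts) one, "O" is outside."""
    entities, current = [], []
    for word, label in zip(words, labels):
        if label == "B":
            if current:                      # close the previous entity
                entities.append(" ".join(current))
            current = [word]
        elif label == "I":
            current.append(word)             # continue, or start a one-word entity
        else:  # "O"
            if current:
                entities.append(" ".join(current))
            current = []
    if current:                              # flush a trailing entity
        entities.append(" ".join(current))
    return entities

# Hypothetical example in the dataset's format:
words = ["Gustave", "Eiffel", "est", "né", "à", "Dijon"]
labels = ["B", "I", "O", "O", "O", "I"]
print(extract_entities(words, labels))  # ['Gustave Eiffel', 'Dijon']
```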
version-control/ds-lib-version-2
--- dataset_info: features: - name: repo_name dtype: string - name: version list: - name: pyproject.toml struct: - name: matplotlib dtype: string - name: numpy dtype: string - name: pandas dtype: string - name: scikit-learn dtype: string - name: scipy dtype: string - name: tensorflow dtype: string - name: torch dtype: string - name: requirements.txt struct: - name: matplotlib dtype: string - name: numpy dtype: string - name: pandas dtype: string - name: scikit-learn dtype: string - name: scipy dtype: string - name: tensorflow dtype: string - name: torch dtype: string - name: setup.py struct: - name: matplotlib dtype: string - name: numpy dtype: string - name: pandas dtype: string - name: scikit-learn dtype: string - name: scipy dtype: string - name: tensorflow dtype: string - name: torch dtype: string - name: hexsha sequence: string splits: - name: train num_bytes: 2516181 num_examples: 10000 download_size: 827987 dataset_size: 2516181 configs: - config_name: default data_files: - split: train path: data/train-* ---
liuyanchen1015/MULTI_VALUE_cola_regularized_reflexives_object_pronouns
--- dataset_info: features: - name: sentence dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 319 num_examples: 7 - name: test num_bytes: 367 num_examples: 6 - name: train num_bytes: 3466 num_examples: 55 download_size: 7580 dataset_size: 4152 --- # Dataset Card for "MULTI_VALUE_cola_regularized_reflexives_object_pronouns" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
santyzenith/embeddings_frases
--- license: mit ---
AdapterOcean/python3-standardized_cluster_19
--- dataset_info: features: - name: text dtype: string - name: conversation_id dtype: int64 - name: embedding sequence: float64 - name: cluster dtype: int64 splits: - name: train num_bytes: 57228850 num_examples: 5223 download_size: 12343048 dataset_size: 57228850 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "python3-standardized_cluster_19" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hynky/alpacc-contract-summ
--- dataset_info: features: - name: output dtype: string - name: instruction dtype: string splits: - name: train num_bytes: 26403302 num_examples: 4560 download_size: 12636847 dataset_size: 26403302 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "alpacc-contract-summ" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
codeparrot/instructhumaneval
--- dataset_info: features: - name: task_id dtype: string - name: prompt dtype: string - name: canonical_solution dtype: string - name: test dtype: string - name: entry_point dtype: string - name: signature dtype: string - name: docstring dtype: string - name: context dtype: string - name: instruction dtype: string splits: - name: test num_bytes: 335913 num_examples: 164 download_size: 161076 dataset_size: 335913 --- # Instruct HumanEval ## Summary InstructHumanEval is a modified version of OpenAI HumanEval. For a given prompt, we extracted its signature, its docstring as well as its header to create a flexible setting that allows the evaluation of instruction-tuned LLMs. The delimiters used in the instruction-tuning procedure can be used to build an instruction that allows the model to elicit its best capabilities. Here is an example of use. The prompt can be built as follows, depending on the model's instruction-tuning delimiters: ```python from datasets import load_dataset ds = load_dataset("codeparrot/instructhumaneval", split="test", use_auth_token=True) prompt_0 = "Human\n" + ds[0]["instruction"] + "\nAssistant\n" + ds[0]["context"] print(prompt_0) ``` Output ``` Human: Write a function has_close_elements(numbers: List[float], threshold: float) -> bool to solve the following problem: Check if in given list of numbers, are any two numbers closer to each other than given threshold. >>> has_close_elements([1.0, 2.0, 3.0], 0.5) False >>> has_close_elements([1.0, 2.8, 3.0, 4.0, 5.0, 2.0], 0.3) True Assistant: from typing import List def has_close_elements(numbers: List[float], threshold: float) -> bool: ``` The model can therefore complete the instruction and yield better results because it fits its training procedure. You can also find the code to evaluate models on the dataset in the [BigCode-evaluation-harness](https://github.com/bigcode-project/bigcode-evaluation-harness/tree/main). The following sections provide more details on the dataset.
## Dataset description This dataset is a modified version of [OpenAI HumanEval](https://huggingface.co/datasets/openai_humaneval) that is designed to adapt the benchmark to instruction fine-tuned models. As a matter of fact, HumanEval evaluates the ability to complete code given its signature, its docstring and potentially some auxiliary functions. ## Dataset construction In order to build an instruction version of HumanEval, we extracted relevant information from the **prompt** column of the original version: - **signature** : this is the signature of the function to complete. It looks like `def function_name(args: type) -> return_type`. - **docstring** : this is the docstring of the function. It is the text which describes the purpose of the function. - **context** : this represents all additional information that is provided in order to help the model complete the function. It includes the imports and the auxiliary functions. Our idea was to move from the original format of HumanEval ``` <context> <signature> <docstring> ``` And build an **instruction** that would be ``` Write a function <signature> to solve the following problem: <docstring> ``` From this instruction, we can design an evaluation pipeline for instruction fine-tuned language models. ## Evaluation Instruction fine-tuned LLMs are built by fine-tuning a base LLM on an instruction dataset. This instruction dataset contains several <input, output> pairs, each representing an instruction submitted by a user together with the right answer to it. These pairs are framed into a multi-turn conversation with the help of special tokens which designate each member of the interaction, e.g. a user_token `Human:`, an assistant_token `Assistant:` and an end_token `\n` that designates the end of each turn.
### Code completion In this case, the LLM is provided with the following prompt ``` user_token + <instruction> + <end_token> + <assistant_token> + <context> ``` It is then expected to complete the function to solve the problem formulated by the `instruction`. It is very similar to the original evaluation with the advantage that it puts the model in the best condition to understand the task that it is asked to solve. The evaluation is done on the part generated after `<assistant_token>`. ### Docstring to code This setting is more complicated as it requires the model to account for the information contained in the instruction, such as the function signature. The LLM is provided with the following prompt ``` user_token + <instruction> + <end_token> + <assistant_token> ``` The model has to generate a function with the correct signature that adequately solves the problem. The evaluation is done by identifying the content of the function in the generation (by searching for the right `entry_point`/`function_name`) and concatenating it with the `<context>` provided. ## How to use the dataset ```python from datasets import load_dataset ds = load_dataset("codeparrot/instructhumaneval") ``` ``` ds DatasetDict({ test: Dataset({ features: ['task_id', 'prompt', 'canonical_solution', 'test', 'entry_point', 'signature', 'docstring', 'context', 'instruction'], num_rows: 164 }) }) ```
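The two prompt formats above can be assembled with a small helper (a minimal sketch: the delimiter defaults are placeholders, since each model has its own instruction-tuning tokens, and the `example` record below is a hypothetical stand-in for a dataset row with `instruction` and `context` fields):

```python
def build_prompt(example, user_token="Human: ", end_token="\n",
                 assistant_token="Assistant: ", with_context=True):
    """Assemble the evaluation prompt from an InstructHumanEval-style record.
    with_context=True gives the code-completion setting,
    with_context=False the docstring-to-code setting."""
    prompt = user_token + example["instruction"] + end_token + assistant_token
    if with_context:
        prompt += example["context"]
    return prompt

example = {  # hypothetical record mimicking the card's fields
    "instruction": "Write a function add(a, b) to solve the following problem:\n"
                   "Return the sum of a and b.",
    "context": "def add(a, b):",
}
print(build_prompt(example))                      # code-completion prompt
print(build_prompt(example, with_context=False))  # docstring-to-code prompt
```

In the docstring-to-code setting the generated function body would then be located via `entry_point` and concatenated with the record's `context`, as described above.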
vwxyzjn/ultrafeedback_binarized_1708454270
--- dataset_info: features: - name: prompt dtype: string - name: prompt_id dtype: string - name: chosen list: - name: content dtype: string - name: role dtype: string - name: rejected list: - name: content dtype: string - name: role dtype: string - name: messages list: - name: content dtype: string - name: role dtype: string - name: score_chosen dtype: float64 - name: score_rejected dtype: float64 - name: query list: - name: content dtype: string - name: role dtype: string - name: query_token sequence: int64 - name: query_token_len dtype: int64 - name: query_chosen_token sequence: int64 - name: query_chosen_token_len dtype: int64 - name: chosen_token sequence: int64 - name: chosen_token_len dtype: int64 - name: query_rejected_token sequence: int64 - name: query_rejected_token_len dtype: int64 - name: rejected_token sequence: int64 - name: rejected_token_len dtype: int64 splits: - name: test_prefs num_bytes: 17073037.442 num_examples: 796 - name: train_prefs num_bytes: 532533995.7192443 num_examples: 24488 download_size: 116256361 dataset_size: 549607033.1612443 --- # Dataset Card for "ultrafeedback_binarized_1708454270" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2
--- pretty_name: Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-09T20:33:17.443758](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2/blob/main/results_2024-03-09T20-33-17.443758.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.615849136003834,\n\ \ \"acc_stderr\": 0.032970270965001755,\n \"acc_norm\": 0.620423209333745,\n\ \ \"acc_norm_stderr\": 0.03363666991029327,\n \"mc1\": 0.47368421052631576,\n\ \ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6421524161247847,\n\ \ \"mc2_stderr\": 0.01506624561829692\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379977,\n\ \ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6369249153555069,\n\ \ \"acc_stderr\": 0.004799034356969391,\n \"acc_norm\": 0.8298147779326828,\n\ \ \"acc_norm_stderr\": 0.0037502741958275972\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\ \ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\ \ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\ \ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\ \ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.03942082639927213\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\ : 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\ \ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\ \ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\ \ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\ \ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\ \ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\ \ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\ \ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\ acc_norm\": 0.3915343915343915,\n 
\"acc_norm_stderr\": 0.025138091388851105\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\ \ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\ \ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.026069362295335134,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.026069362295335134\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\ \ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\ : 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\ acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306422,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885113,\n\ \ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885113\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ 
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \ \ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\ \ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8128440366972477,\n \"acc_stderr\": 0.01672268452620015,\n \"\ acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.01672268452620015\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\ acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\ acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \ \ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\ \ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\ \ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\ \ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8099173553719008,\n \"acc_stderr\": 
0.03581796951709282,\n \"\ acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\ \ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\ \ \"acc_stderr\": 0.014836205167333555,\n \"acc_norm\": 0.7790549169859514,\n\ \ \"acc_norm_stderr\": 0.014836205167333555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n\ \ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\ \ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\ \ \"acc_norm_stderr\": 0.01634538676210397\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\ \ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\ \ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\ \ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\ \ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \ \ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\ \ \"acc_stderr\": 0.012689708167787687,\n \"acc_norm\": 0.4439374185136897,\n\ \ \"acc_norm_stderr\": 0.012689708167787687\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\ \ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623354,\n \ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623354\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\ \ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\ \ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\ \ 
\"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\ \ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \ \ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\ acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\ \ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6421524161247847,\n\ \ \"mc2_stderr\": 0.01506624561829692\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774092\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42153146322971946,\n \ \ \"acc_stderr\": 0.013601824409483267\n }\n}\n```" repo_url: https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|arc:challenge|25_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-09T20-33-17.443758.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|gsm8k|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hellaswag|10_2024-03-09T20-33-17.443758.parquet' - split: latest 
path: - '**/details_harness|hellaswag|10_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|truthfulqa:mc|0_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-09T20-33-17.443758.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_09T20_33_17.443758 path: - '**/details_harness|winogrande|5_2024-03-09T20-33-17.443758.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-09T20-33-17.443758.parquet' - config_name: results data_files: - split: 
2024_03_09T20_33_17.443758 path: - results_2024-03-09T20-33-17.443758.parquet - split: latest path: - results_2024-03-09T20-33-17.443758.parquet --- # Dataset Card for Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-09T20:33:17.443758](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2/blob/main/results_2024-03-09T20-33-17.443758.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.615849136003834, "acc_stderr": 0.032970270965001755, "acc_norm": 0.620423209333745, "acc_norm_stderr": 0.03363666991029327, "mc1": 0.47368421052631576, "mc1_stderr": 0.017479241161975526, "mc2": 0.6421524161247847, "mc2_stderr": 0.01506624561829692 }, "harness|arc:challenge|25": { "acc": 0.5776450511945392, "acc_stderr": 0.014434138713379977, "acc_norm": 0.6203071672354948, "acc_norm_stderr": 0.014182119866974872 }, "harness|hellaswag|10": { "acc": 0.6369249153555069, "acc_stderr": 0.004799034356969391, "acc_norm": 0.8298147779326828, "acc_norm_stderr": 0.0037502741958275972 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6578947368421053, "acc_stderr": 0.03860731599316091, "acc_norm": 0.6578947368421053, "acc_norm_stderr": 0.03860731599316091 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03942082639927213, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03942082639927213 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, 
"acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.03724249595817731, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.03724249595817731 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.041227371113703316, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3915343915343915, "acc_stderr": 0.025138091388851105, "acc_norm": 0.3915343915343915, "acc_norm_stderr": 0.025138091388851105 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768177, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768177 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7, "acc_stderr": 0.026069362295335134, "acc_norm": 0.7, "acc_norm_stderr": 0.026069362295335134 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306422, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306422 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5666666666666667, "acc_stderr": 0.025124653525885113, "acc_norm": 0.5666666666666667, "acc_norm_stderr": 0.025124653525885113 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.02803792996911499, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.02803792996911499 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.01672268452620015, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.01672268452620015 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 
0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7696078431372549, "acc_stderr": 0.029554292605695066, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.029554292605695066 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6143497757847534, "acc_stderr": 0.03266842214289201, "acc_norm": 0.6143497757847534, "acc_norm_stderr": 0.03266842214289201 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467766, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467766 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.042365112580946336, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.03462419931615623, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.03462419931615623 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690879, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690879 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 
0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7790549169859514, "acc_stderr": 0.014836205167333555, "acc_norm": 0.7790549169859514, "acc_norm_stderr": 0.014836205167333555 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6763005780346821, "acc_stderr": 0.025190181327608408, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.025190181327608408 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.394413407821229, "acc_stderr": 0.01634538676210397, "acc_norm": 0.394413407821229, "acc_norm_stderr": 0.01634538676210397 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6784565916398714, "acc_stderr": 0.026527724079528872, "acc_norm": 0.6784565916398714, "acc_norm_stderr": 0.026527724079528872 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6759259259259259, "acc_stderr": 0.026041766202717156, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.026041766202717156 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4439374185136897, "acc_stderr": 0.012689708167787687, "acc_norm": 0.4439374185136897, "acc_norm_stderr": 0.012689708167787687 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6323529411764706, "acc_stderr": 0.02928941340940319, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.02928941340940319 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6176470588235294, "acc_stderr": 0.019659922493623354, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.019659922493623354 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 
0.04265792110940588, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940588 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.0289205832206756, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.0289205832206756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7910447761194029, "acc_stderr": 0.028748298931728655, "acc_norm": 0.7910447761194029, "acc_norm_stderr": 0.028748298931728655 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.47368421052631576, "mc1_stderr": 0.017479241161975526, "mc2": 0.6421524161247847, "mc2_stderr": 0.01506624561829692 }, "harness|winogrande|5": { "acc": 0.7758484609313339, "acc_stderr": 0.011720400740774092 }, "harness|gsm8k|5": { "acc": 0.42153146322971946, "acc_stderr": 0.013601824409483267 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
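The eval repos above expose each run as a split named with the run timestamp, next to a `latest` alias that points at the most recent run. When several runs have accumulated, the newest one can be picked programmatically from the split names; this is a small sketch (the helper name is ours, not part of the card):

```python
from datetime import datetime

def latest_run_split(split_names):
    """Return the most recent timestamp-named split.

    Splits in these eval datasets are named either "latest" or with
    the run timestamp, e.g. "2024_03_09T20_33_17.443758" as in the
    configs above.
    """
    fmt = "%Y_%m_%dT%H_%M_%S.%f"
    # Drop the alias; compare actual run timestamps.
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, fmt))

print(latest_run_split(["2024_01_27T17_40_02.776744",
                        "2024_03_09T20_33_17.443758",
                        "latest"]))
```

Because the timestamps are zero-padded, a plain lexicographic `max` would give the same answer; parsing with `strptime` just makes the intent explicit.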
Vincentnien/bigcode-pii-pjj_checks
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: repo_id dtype: string - name: file_path dtype: string - name: content dtype: string - name: __index_level_0__ dtype: int64 - name: index dtype: int64 - name: secrets dtype: string - name: has_secrets dtype: bool - name: number_secrets dtype: int64 - name: new_content dtype: string - name: modified dtype: bool - name: references dtype: string splits: - name: train num_bytes: 498783.18073461443 num_examples: 46 download_size: 0 dataset_size: 498783.18073461443 --- # Dataset Card for "bigcode-pii-pjj_checks" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
NickyNicky/oasst2_chatml
--- dataset_info: features: - name: Text dtype: string splits: - name: train num_bytes: 35636342 num_examples: 13848 download_size: 19635797 dataset_size: 35636342 configs: - config_name: default data_files: - split: train path: data/train-* language: - en - es - ru - zh - de - fr - th - ca - it - ja - pl - eo - eu - vi - fi - hu - ar - nl - da - tr - ko - he - id - cs - bn - sv --- ``` link: https://huggingface.co/datasets/OpenAssistant/oasst2 ``` Message counts by language: - en: 64,513 - es: 28,199 - ru: 13,935 - zh: 8,615 - de: 6,145 - fr: 3,880 - pt-BR: 2,699 - th: 1,560 - ca: 1,283 - it: 943 - uk-UA: 845 - ja: 788 - pl: 435 - eo: 295 - eu: 274 - vi: 207 - fi: 138 - hu: 113 - ar: 80 - nl: 72 - da: 44 - tr: 37 - ko: 24 - he: 24 - id: 12 - cs: 12 - bn: 1 - sv: 1
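For a quick sense of the language skew, the per-language message counts above can be tallied; this is a throwaway sketch, with the numbers copied verbatim from the list above (they describe the upstream oasst2 messages, not this dataset's row count):

```python
# Message counts by language, copied from the list above.
counts = {
    "en": 64513, "es": 28199, "ru": 13935, "zh": 8615, "de": 6145,
    "fr": 3880, "pt-BR": 2699, "th": 1560, "ca": 1283, "it": 943,
    "uk-UA": 845, "ja": 788, "pl": 435, "eo": 295, "eu": 274,
    "vi": 207, "fi": 138, "hu": 113, "ar": 80, "nl": 72,
    "da": 44, "tr": 37, "ko": 24, "he": 24, "id": 12, "cs": 12,
    "bn": 1, "sv": 1,
}

total = sum(counts.values())
# Print the share of the five largest languages.
for lang, n in sorted(counts.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{lang}: {n / total:.1%}")
```

English alone accounts for nearly half of all messages, which is worth keeping in mind when training multilingual models on this data.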
Fiizy/Diizy
--- license: afl-3.0 ---
dog/fuego-20230214-041117-63ec52
--- tags: - fuego fuego: id: 20230214-041117-63ec52 status: preparing script: run.py requirements_file: requirements.txt space_id: dog/fuego-20230214-041117-63ec52 space_hardware: cpu-basic ---
AlekseyKorshuk/synthetic-friendly-characters
--- dataset_info: features: - name: name dtype: string - name: categories sequence: string - name: personalities sequence: string - name: description dtype: string - name: conversation list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 10379252 num_examples: 3871 download_size: 5610826 dataset_size: 10379252 --- # Dataset Card for "synthetic-friendly-characters" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
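Each row's `conversation` is a list of messages with `role` and `content` fields, per the schema above. A minimal sketch of flattening one into a plain transcript (the helper is hypothetical, not part of the dataset):

```python
def render_conversation(conversation):
    # conversation: list of {"role": ..., "content": ...} dicts,
    # matching the `conversation` feature declared in the schema above.
    return "\n".join(f"{m['role']}: {m['content']}" for m in conversation)

demo = [
    {"role": "user", "content": "Hi!"},
    {"role": "assistant", "content": "Hello there."},
]
print(render_conversation(demo))
```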
chenai2/chenai-oushi-jiaju-controlnet-dataset
--- dataset_info: features: - name: original_image dtype: image - name: condtioning_image dtype: image - name: caption dtype: string splits: - name: train num_bytes: 57901218.0 num_examples: 104 download_size: 57912318 dataset_size: 57901218.0 --- # Dataset Card for "chenai-oushi-jiaju-controlnet-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
MrDexxter/ArtDataset
--- license: apache-2.0 ---
open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-7b-ep3-v2
--- pretty_name: Evaluation run of namirocks/mistral-shishya-all-hal-7b-ep3-v2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [namirocks/mistral-shishya-all-hal-7b-ep3-v2](https://huggingface.co/namirocks/mistral-shishya-all-hal-7b-ep3-v2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-7b-ep3-v2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-27T17:40:02.776744](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-7b-ep3-v2/blob/main/results_2024-01-27T17-40-02.776744.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.31023788017540777,\n\ \ \"acc_stderr\": 0.032160633442898136,\n \"acc_norm\": 0.31223579950852837,\n\ \ \"acc_norm_stderr\": 0.03302401738622922,\n \"mc1\": 0.25703794369645044,\n\ \ \"mc1_stderr\": 0.015298077509485081,\n \"mc2\": 0.3970787935629283,\n\ \ \"mc2_stderr\": 0.0146143299284997\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.4300341296928328,\n \"acc_stderr\": 0.014467631559137994,\n\ \ \"acc_norm\": 0.4590443686006826,\n \"acc_norm_stderr\": 0.01456229107360123\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.576777534355706,\n\ \ \"acc_stderr\": 0.004930603061590765,\n \"acc_norm\": 0.7428799044015136,\n\ \ \"acc_norm_stderr\": 0.004361529679492746\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\ \ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n\ \ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\ \ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\ \ \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \ \ \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322663,\n\ \ \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322663\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\ \ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\ \ \"acc_norm_stderr\": 0.03921067198982266\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\ \ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\ \ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\ \ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745653,\n\ \ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745653\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\ \ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\ \ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378949,\n\ \ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378949\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"\ acc_norm\": 0.2671957671957672,\n 
\"acc_norm_stderr\": 0.02278967314577657\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\ \ \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n\ \ \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.3161290322580645,\n \"acc_stderr\": 0.026450874489042764,\n \"\ acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.026450874489042764\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n \"\ acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\ : 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.3878787878787879,\n \"acc_stderr\": 0.03804913653971011,\n\ \ \"acc_norm\": 0.3878787878787879,\n \"acc_norm_stderr\": 0.03804913653971011\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626303,\n \"\ acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626303\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.03340361906276587,\n\ \ \"acc_norm\": 0.31088082901554404,\n \"acc_norm_stderr\": 0.03340361906276587\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325887,\n\ \ \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325887\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \ \ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150006,\n\ \ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150006\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\ acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3577981651376147,\n \"acc_stderr\": 0.020552060784827818,\n \"\ acc_norm\": 0.3577981651376147,\n \"acc_norm_stderr\": 0.020552060784827818\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\ acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.4068627450980392,\n \"acc_stderr\": 0.03447891136353383,\n \"\ acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.03447891136353383\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.4472573839662447,\n \"acc_stderr\": 0.03236564251614192,\n \ \ \"acc_norm\": 0.4472573839662447,\n \"acc_norm_stderr\": 0.03236564251614192\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\ \ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\ \ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.040103589424622034,\n\ \ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.040103589424622034\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\ acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664742,\n\ \ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664742\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.37606837606837606,\n\ \ \"acc_stderr\": 0.03173393632969481,\n \"acc_norm\": 0.37606837606837606,\n\ \ \"acc_norm_stderr\": 0.03173393632969481\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.46360153256704983,\n\ \ \"acc_stderr\": 0.01783252407959326,\n \"acc_norm\": 0.46360153256704983,\n\ \ \"acc_norm_stderr\": 0.01783252407959326\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.02386800326250012,\n\ \ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.02386800326250012\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\ \ \"acc_stderr\": 0.014572650383409153,\n 
\"acc_norm\": 0.2547486033519553,\n\ \ \"acc_norm_stderr\": 0.014572650383409153\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n\ \ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31189710610932475,\n\ \ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.31189710610932475,\n\ \ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537375,\n\ \ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537375\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.24468085106382978,\n \"acc_stderr\": 0.02564555362226673,\n \ \ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.02564555362226673\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\ \ \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n\ \ \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\ \ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2581699346405229,\n \"acc_stderr\": 0.01770453165325007,\n \ \ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.01770453165325007\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\ \ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\ \ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.02635891633490405,\n\ \ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 
0.02635891633490405\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.35323383084577115,\n\ \ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.35323383084577115,\n\ \ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n\ \ \"acc_stderr\": 0.0332939411907353,\n \"acc_norm\": 0.24096385542168675,\n\ \ \"acc_norm_stderr\": 0.0332939411907353\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.0381107966983353,\n\ \ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.0381107966983353\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\ \ \"mc1_stderr\": 0.015298077509485081,\n \"mc2\": 0.3970787935629283,\n\ \ \"mc2_stderr\": 0.0146143299284997\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627538\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/namirocks/mistral-shishya-all-hal-7b-ep3-v2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|arc:challenge|25_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-27T17-40-02.776744.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|gsm8k|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_27T17_40_02.776744 
path: - '**/details_harness|hellaswag|10_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-40-02.776744.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-40-02.776744.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-40-02.776744.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-40-02.776744.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-40-02.776744.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-40-02.776744.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-40-02.776744.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-management|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-40-02.776744.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|truthfulqa:mc|0_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-27T17-40-02.776744.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_27T17_40_02.776744 path: - '**/details_harness|winogrande|5_2024-01-27T17-40-02.776744.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-27T17-40-02.776744.parquet' - config_name: results data_files: - split: 
2024_01_27T17_40_02.776744 path: - results_2024-01-27T17-40-02.776744.parquet - split: latest path: - results_2024-01-27T17-40-02.776744.parquet --- # Dataset Card for Evaluation run of namirocks/mistral-shishya-all-hal-7b-ep3-v2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [namirocks/mistral-shishya-all-hal-7b-ep3-v2](https://huggingface.co/namirocks/mistral-shishya-all-hal-7b-ep3-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-7b-ep3-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-27T17:40:02.776744](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__mistral-shishya-all-hal-7b-ep3-v2/blob/main/results_2024-01-27T17-40-02.776744.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.31023788017540777, "acc_stderr": 0.032160633442898136, "acc_norm": 0.31223579950852837, "acc_norm_stderr": 0.03302401738622922, "mc1": 0.25703794369645044, "mc1_stderr": 0.015298077509485081, "mc2": 0.3970787935629283, "mc2_stderr": 0.0146143299284997 }, "harness|arc:challenge|25": { "acc": 0.4300341296928328, "acc_stderr": 0.014467631559137994, "acc_norm": 0.4590443686006826, "acc_norm_stderr": 0.01456229107360123 }, "harness|hellaswag|10": { "acc": 0.576777534355706, "acc_stderr": 0.004930603061590765, "acc_norm": 0.7428799044015136, "acc_norm_stderr": 0.004361529679492746 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.362962962962963, "acc_stderr": 0.041539484047424, "acc_norm": 0.362962962962963, "acc_norm_stderr": 0.041539484047424 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.34, "acc_stderr": 0.047609522856952344, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952344 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.30566037735849055, "acc_stderr": 0.028353298073322663, "acc_norm": 0.30566037735849055, "acc_norm_stderr": 0.028353298073322663 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3263888888888889, "acc_stderr": 0.03921067198982266, "acc_norm": 0.3263888888888889, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.26, "acc_stderr": 0.04408440022768077, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, 
"acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.33191489361702126, "acc_stderr": 0.030783736757745653, "acc_norm": 0.33191489361702126, "acc_norm_stderr": 0.030783736757745653 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.0409698513984367, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.0409698513984367 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3103448275862069, "acc_stderr": 0.03855289616378949, "acc_norm": 0.3103448275862069, "acc_norm_stderr": 0.03855289616378949 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2671957671957672, "acc_stderr": 0.02278967314577657, "acc_norm": 0.2671957671957672, "acc_norm_stderr": 0.02278967314577657 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276862, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276862 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3161290322580645, "acc_stderr": 0.026450874489042764, "acc_norm": 0.3161290322580645, "acc_norm_stderr": 0.026450874489042764 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2512315270935961, "acc_stderr": 0.030516530732694436, "acc_norm": 0.2512315270935961, "acc_norm_stderr": 0.030516530732694436 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.3878787878787879, "acc_stderr": 0.03804913653971011, "acc_norm": 0.3878787878787879, "acc_norm_stderr": 0.03804913653971011 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3383838383838384, "acc_stderr": 0.03371124142626303, "acc_norm": 0.3383838383838384, "acc_norm_stderr": 0.03371124142626303 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.31088082901554404, "acc_stderr": 0.03340361906276587, "acc_norm": 0.31088082901554404, "acc_norm_stderr": 0.03340361906276587 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.30512820512820515, "acc_stderr": 0.023346335293325887, "acc_norm": 0.30512820512820515, "acc_norm_stderr": 0.023346335293325887 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.02671924078371216, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.02671924078371216 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.36134453781512604, "acc_stderr": 0.031204691225150006, "acc_norm": 0.36134453781512604, "acc_norm_stderr": 0.031204691225150006 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.036313298039696525, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3577981651376147, "acc_stderr": 0.020552060784827818, "acc_norm": 0.3577981651376147, "acc_norm_stderr": 0.020552060784827818 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.37037037037037035, "acc_stderr": 
0.03293377139415191, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.03293377139415191 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4068627450980392, "acc_stderr": 0.03447891136353383, "acc_norm": 0.4068627450980392, "acc_norm_stderr": 0.03447891136353383 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.4472573839662447, "acc_stderr": 0.03236564251614192, "acc_norm": 0.4472573839662447, "acc_norm_stderr": 0.03236564251614192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3901345291479821, "acc_stderr": 0.03273766725459157, "acc_norm": 0.3901345291479821, "acc_norm_stderr": 0.03273766725459157 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.29770992366412213, "acc_stderr": 0.040103589424622034, "acc_norm": 0.29770992366412213, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2231404958677686, "acc_stderr": 0.03800754475228733, "acc_norm": 0.2231404958677686, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2822085889570552, "acc_stderr": 0.03536117886664742, "acc_norm": 0.2822085889570552, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340456, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340456 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.37606837606837606, "acc_stderr": 0.03173393632969481, "acc_norm": 0.37606837606837606, "acc_norm_stderr": 0.03173393632969481 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.31, "acc_stderr": 
0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.46360153256704983, "acc_stderr": 0.01783252407959326, "acc_norm": 0.46360153256704983, "acc_norm_stderr": 0.01783252407959326 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.26878612716763006, "acc_stderr": 0.02386800326250012, "acc_norm": 0.26878612716763006, "acc_norm_stderr": 0.02386800326250012 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2547486033519553, "acc_stderr": 0.014572650383409153, "acc_norm": 0.2547486033519553, "acc_norm_stderr": 0.014572650383409153 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2679738562091503, "acc_stderr": 0.025360603796242557, "acc_norm": 0.2679738562091503, "acc_norm_stderr": 0.025360603796242557 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.31189710610932475, "acc_stderr": 0.026311858071854155, "acc_norm": 0.31189710610932475, "acc_norm_stderr": 0.026311858071854155 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2716049382716049, "acc_stderr": 0.024748624490537375, "acc_norm": 0.2716049382716049, "acc_norm_stderr": 0.024748624490537375 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24468085106382978, "acc_stderr": 0.02564555362226673, "acc_norm": 0.24468085106382978, "acc_norm_stderr": 0.02564555362226673 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24771838331160365, "acc_stderr": 0.011025499291443737, "acc_norm": 0.24771838331160365, "acc_norm_stderr": 0.011025499291443737 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121593, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121593 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2581699346405229, "acc_stderr": 0.01770453165325007, "acc_norm": 0.2581699346405229, "acc_norm_stderr": 0.01770453165325007 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2909090909090909, 
"acc_stderr": 0.04350271442923243, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2163265306122449, "acc_stderr": 0.02635891633490405, "acc_norm": 0.2163265306122449, "acc_norm_stderr": 0.02635891633490405 }, "harness|hendrycksTest-sociology|5": { "acc": 0.35323383084577115, "acc_stderr": 0.03379790611796777, "acc_norm": 0.35323383084577115, "acc_norm_stderr": 0.03379790611796777 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-virology|5": { "acc": 0.24096385542168675, "acc_stderr": 0.0332939411907353, "acc_norm": 0.24096385542168675, "acc_norm_stderr": 0.0332939411907353 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.4444444444444444, "acc_stderr": 0.0381107966983353, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.0381107966983353 }, "harness|truthfulqa:mc|0": { "mc1": 0.25703794369645044, "mc1_stderr": 0.015298077509485081, "mc2": 0.3970787935629283, "mc2_stderr": 0.0146143299284997 }, "harness|winogrande|5": { "acc": 0.6977111286503551, "acc_stderr": 0.012907200361627538 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
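Returning to the "Latest results" JSON shown earlier in this card: each `harness|...` key maps one task configuration to its metrics, and the `all` block aggregates over them. A minimal offline sketch of walking that structure (only two per-task entries from the card are reproduced, and the plain unweighted mean below is an illustrative aggregation, not necessarily the leaderboard's exact averaging procedure):

```python
# Sketch: walk a results dict shaped like the "Latest results" JSON above.
# Only two per-task entries are reproduced; the values are copied from the card.
results = {
    "harness|arc:challenge|25": {"acc": 0.4300341296928328, "acc_norm": 0.4590443686006826},
    "harness|hellaswag|10": {"acc": 0.576777534355706, "acc_norm": 0.7428799044015136},
}

# Unweighted mean accuracy over the tasks present (illustrative only).
accs = [metrics["acc"] for metrics in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"{mean_acc:.4f}")  # → 0.5034
```

The `all` block in the card averages over all 63 task configurations, so its value differs from this two-task mean.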
MocktaiLEngineer/qmsum-processed
--- license: mit ---
liuyanchen1015/MULTI_VALUE_mnli_it_dobj
--- dataset_info: features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: score dtype: int64 splits: - name: dev_matched num_bytes: 109953 num_examples: 433 - name: dev_mismatched num_bytes: 124828 num_examples: 477 - name: test_matched num_bytes: 93451 num_examples: 374 - name: test_mismatched num_bytes: 121786 num_examples: 473 - name: train num_bytes: 4430384 num_examples: 16783 download_size: 2940034 dataset_size: 4880402 --- # Dataset Card for "MULTI_VALUE_mnli_it_dobj" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Rezuwan/Parrot60_Dataset
--- license: apache-2.0 ---
poorfish/fishdataset2
--- license: mit task_categories: - text-classification language: - cs tags: - finance pretty_name: medic size_categories: - 100M<n<1B ---
AndreasTo/MyTestDataset
--- license: apache-2.0 ---
one-sec-cv12/chunk_121
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 26670320496.375 num_examples: 277677 download_size: 24903690762 dataset_size: 26670320496.375 --- # Dataset Card for "chunk_121" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Hack90/ncbi_genbank_part_8
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: id dtype: string - name: sequence dtype: string - name: name dtype: string - name: description dtype: string - name: features dtype: int64 - name: seq_length dtype: int64 splits: - name: train num_bytes: 19567803802 num_examples: 10984 download_size: 9068866549 dataset_size: 19567803802 --- # Dataset Card for "ncbi_genbank_part_8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_cloudyu__mistral_15B_instruct_v0.1
--- pretty_name: Evaluation run of cloudyu/mistral_15B_instruct_v0.1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [cloudyu/mistral_15B_instruct_v0.1](https://huggingface.co/cloudyu/mistral_15B_instruct_v0.1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__mistral_15B_instruct_v0.1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-05T01:26:23.846966](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__mistral_15B_instruct_v0.1/blob/main/results_2024-03-05T01-26-23.846966.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5970175289044487,\n\ \ \"acc_stderr\": 0.03326496903990425,\n \"acc_norm\": 0.6016631223280173,\n\ \ \"acc_norm_stderr\": 0.03394368498901014,\n \"mc1\": 0.4589963280293758,\n\ \ \"mc1_stderr\": 0.017444544447661196,\n \"mc2\": 0.6342984767656984,\n\ \ \"mc2_stderr\": 0.015283636124581295\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633827,\n\ \ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216384\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6253734315873332,\n\ \ \"acc_stderr\": 0.004830371317841051,\n \"acc_norm\": 0.8170683130850428,\n\ \ \"acc_norm_stderr\": 0.003858203851819931\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\ \ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\ \ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\ \ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\ \ \"acc_norm_stderr\": 0.03852084696008534\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\ \ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\ \ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\ \ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\ \ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\ \ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\ \ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\ \ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\ acc_norm\": 0.3968253968253968,\n 
\"acc_norm_stderr\": 0.02519710107424649\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\ \ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\ \ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\ \ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n\ \ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\ : 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\ \ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\ acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153324,\n\ \ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153324\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \ \ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \ \ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\ \ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7743119266055046,\n \"acc_stderr\": 0.01792308766780306,\n \"\ acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.01792308766780306\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\ acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035307,\n \ \ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035307\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\ \ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\ \ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\ \ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\ \ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\ \ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\ \ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\ \ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069713,\n\ \ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069713\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\ \ \"acc_stderr\": 0.015652542496421132,\n \"acc_norm\": 0.3240223463687151,\n\ \ 
\"acc_norm_stderr\": 0.015652542496421132\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n\ \ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\ \ \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n\ \ \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n\ \ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\ \ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\ \ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n\ \ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6274509803921569,\n \"acc_stderr\": 0.01955964680921593,\n \ \ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.01955964680921593\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\ \ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\ \ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\ \ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\ \ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.6865671641791045,\n\ \ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\ \ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\ \ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\ \ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n\ \ \"mc1_stderr\": 0.017444544447661196,\n \"mc2\": 0.6342984767656984,\n\ \ \"mc2_stderr\": 0.015283636124581295\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3858984078847612,\n \ \ \"acc_stderr\": 0.013409077471319178\n }\n}\n```" repo_url: https://huggingface.co/cloudyu/mistral_15B_instruct_v0.1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|arc:challenge|25_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-05T01-26-23.846966.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|gsm8k|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_05T01_26_23.846966 path: - 
'**/details_harness|hellaswag|10_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-26-23.846966.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-26-23.846966.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-26-23.846966.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-26-23.846966.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-26-23.846966.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-26-23.846966.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-26-23.846966.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T01-26-23.846966.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|truthfulqa:mc|0_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-05T01-26-23.846966.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_05T01_26_23.846966 path: - '**/details_harness|winogrande|5_2024-03-05T01-26-23.846966.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-05T01-26-23.846966.parquet' - config_name: results data_files: - split: 
2024_03_05T01_26_23.846966 path: - results_2024-03-05T01-26-23.846966.parquet - split: latest path: - results_2024-03-05T01-26-23.846966.parquet --- # Dataset Card for Evaluation run of cloudyu/mistral_15B_instruct_v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [cloudyu/mistral_15B_instruct_v0.1](https://huggingface.co/cloudyu/mistral_15B_instruct_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cloudyu__mistral_15B_instruct_v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-05T01:26:23.846966](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__mistral_15B_instruct_v0.1/blob/main/results_2024-03-05T01-26-23.846966.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5970175289044487, "acc_stderr": 0.03326496903990425, "acc_norm": 0.6016631223280173, "acc_norm_stderr": 0.03394368498901014, "mc1": 0.4589963280293758, "mc1_stderr": 0.017444544447661196, "mc2": 0.6342984767656984, "mc2_stderr": 0.015283636124581295 }, "harness|arc:challenge|25": { "acc": 0.5477815699658704, "acc_stderr": 0.014544519880633827, "acc_norm": 0.5844709897610921, "acc_norm_stderr": 0.014401366641216384 }, "harness|hellaswag|10": { "acc": 0.6253734315873332, "acc_stderr": 0.004830371317841051, "acc_norm": 0.8170683130850428, "acc_norm_stderr": 0.003858203851819931 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.618421052631579, "acc_stderr": 0.03953173377749194, "acc_norm": 0.618421052631579, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.660377358490566, "acc_stderr": 0.02914690474779833, "acc_norm": 0.660377358490566, "acc_norm_stderr": 0.02914690474779833 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6944444444444444, "acc_stderr": 0.03852084696008534, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.03852084696008534 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 
0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5664739884393064, "acc_stderr": 0.03778621079092056, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.03778621079092056 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.032650194750335815, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.02519710107424649, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.02519710107424649 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6387096774193548, "acc_stderr": 0.02732754844795754, "acc_norm": 0.6387096774193548, "acc_norm_stderr": 0.02732754844795754 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.0347769116216366, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7424242424242424, "acc_stderr": 0.03115626951964683, "acc_norm": 0.7424242424242424, "acc_norm_stderr": 0.03115626951964683 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153324, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153324 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.541025641025641, "acc_stderr": 0.025265525491284295, "acc_norm": 0.541025641025641, "acc_norm_stderr": 0.025265525491284295 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059278, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059278 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7743119266055046, "acc_stderr": 0.01792308766780306, "acc_norm": 0.7743119266055046, "acc_norm_stderr": 0.01792308766780306 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 
0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.03058759135160425, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.03058759135160425 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.028304657943035307, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.028304657943035307 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6870229007633588, "acc_stderr": 0.04066962905677698, "acc_norm": 0.6870229007633588, "acc_norm_stderr": 0.04066962905677698 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729245, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729245 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 
0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7943805874840357, "acc_stderr": 0.01445250045678583, "acc_norm": 0.7943805874840357, "acc_norm_stderr": 0.01445250045678583 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.684971098265896, "acc_stderr": 0.025009313790069713, "acc_norm": 0.684971098265896, "acc_norm_stderr": 0.025009313790069713 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3240223463687151, "acc_stderr": 0.015652542496421132, "acc_norm": 0.3240223463687151, "acc_norm_stderr": 0.015652542496421132 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6699346405228758, "acc_stderr": 0.026925654653615697, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.026925654653615697 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.02623696588115326, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.02623696588115326 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6944444444444444, "acc_stderr": 0.025630824975621344, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.025630824975621344 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41460234680573665, "acc_stderr": 0.012582597058908284, "acc_norm": 0.41460234680573665, "acc_norm_stderr": 0.012582597058908284 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6102941176470589, "acc_stderr": 0.029624663581159696, "acc_norm": 0.6102941176470589, "acc_norm_stderr": 0.029624663581159696 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6274509803921569, "acc_stderr": 0.01955964680921593, "acc_norm": 0.6274509803921569, "acc_norm_stderr": 0.01955964680921593 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 
0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304328, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304328 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6865671641791045, "acc_stderr": 0.032801882053486435, "acc_norm": 0.6865671641791045, "acc_norm_stderr": 0.032801882053486435 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333047, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333047 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.4589963280293758, "mc1_stderr": 0.017444544447661196, "mc2": 0.6342984767656984, "mc2_stderr": 0.015283636124581295 }, "harness|winogrande|5": { "acc": 0.7624309392265194, "acc_stderr": 0.011961298905803152 }, "harness|gsm8k|5": { "acc": 0.3858984078847612, "acc_stderr": 0.013409077471319178 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
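The per-task keys in the "Latest results" JSON above all follow the pattern `harness|<task>|<num_fewshot>`. A minimal helper for splitting such keys into their parts (a sketch for working with the results dict, not part of any official harness API):

```python
def parse_result_key(key):
    """Split a result key like 'harness|hendrycksTest-astronomy|5'
    into (suite, task, num_fewshot)."""
    suite, task, fewshot = key.split("|")
    return suite, task, int(fewshot)

print(parse_result_key("harness|hendrycksTest-astronomy|5"))
print(parse_result_key("harness|truthfulqa:mc|0"))
```

This makes it easy, for example, to group the loaded results by task suite or to filter for a given few-shot setting.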
dongyoung4091/hh-rlhf_with_features_fail-to-consider-context
--- dataset_info: features: - name: chosen dtype: string - name: rejected dtype: string - name: chosen_value dtype: float64 - name: rejected_value dtype: float64 splits: - name: train num_bytes: 13454657 num_examples: 19148 download_size: 8012608 dataset_size: 13454657 --- # Dataset Card for "hh-rlhf_with_features_fail-to-consider-context" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
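As a purely local sanity check, the split metadata in the YAML header above implies an average serialized size per chosen/rejected pair (arithmetic on the card's own numbers, not a measurement of the files):

```python
# values copied from the dataset_info header above
num_bytes = 13_454_657
num_examples = 19_148

avg_bytes_per_example = num_bytes / num_examples
print(round(avg_bytes_per_example))  # roughly 703 bytes per example
```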
CyberHarem/hai_tien_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of hai_tien/海天/海天 (Azur Lane) This is the dataset of hai_tien/海天/海天 (Azur Lane), containing 16 images and their tags. The core tags of this character are `breasts, long_hair, bangs, hair_ornament, hair_flower, yellow_eyes, white_hair, very_long_hair, black_hair, multicolored_hair, grey_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 16 | 28.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hai_tien_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 16 | 13.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hai_tien_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 32 | 25.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hai_tien_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 16 | 23.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hai_tien_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels.
| | stage3-p480-1200 | 32 | 37.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hai_tien_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/hai_tien_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, bare_shoulders, cleavage, full_body, long_sleeves, wide_sleeves, holding, white_dress, closed_mouth, jewelry, single_thighhigh, white_thighhighs, detached_sleeves, criss-cross_halter, white_background, white_flower | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | bare_shoulders | cleavage | full_body | long_sleeves | wide_sleeves | holding | white_dress | closed_mouth | jewelry | single_thighhigh | white_thighhighs | detached_sleeves | criss-cross_halter | white_background | white_flower | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------------|:-----------|:------------|:---------------|:---------------|:----------|:--------------|:---------------|:----------|:-------------------|:-------------------|:-------------------|:---------------------|:-------------------|:---------------| | 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X 
| X | X | X | X | X | X | X | X | X |
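For the IMG+TXT packages listed above, each image is assumed to be accompanied by a same-named `.txt` file carrying its tags (an assumption based on the package type; the exact archive layout is not documented on this card). A minimal pairing sketch over an extracted package directory:

```python
import os

IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}

def pair_images_with_tags(dataset_dir):
    """Return (image_path, tags) pairs, assuming each image sits next to
    a same-named .txt tag file in the extracted directory."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in IMAGE_EXTS:
            txt_path = os.path.join(dataset_dir, stem + '.txt')
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs
```

Images without a matching tag file are simply skipped, so the function degrades gracefully if the layout assumption does not hold for every file.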
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-42000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 666081 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
liuyanchen1015/MULTI_VALUE_mrpc_his_him
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 33256 num_examples: 126 - name: train num_bytes: 76451 num_examples: 286 - name: validation num_bytes: 9231 num_examples: 34 download_size: 89524 dataset_size: 118938 --- # Dataset Card for "MULTI_VALUE_mrpc_his_him" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
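The per-split byte counts in the YAML header above should sum to the declared `dataset_size`; a quick local consistency check (values copied from the header):

```python
# per-split num_bytes from the dataset_info header above
split_bytes = {"test": 33_256, "train": 76_451, "validation": 9_231}
dataset_size = 118_938

assert sum(split_bytes.values()) == dataset_size
print(sum(split_bytes.values()))  # 118938
```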
SaiedAlshahrani/Wikipedia-Corpora-Report
--- license: mit pretty_name: Wikipedia-Corpora-Report size_categories: - 1K<n<10K --- # Dataset Card for "Wikipedia-Corpora-Report" This dataset is used as a metadata database for the online [WIKIPEDIA CORPORA META REPORT](https://wikipedia-corpora-report.streamlit.app/) dashboard that illustrates how humans and bots generate or edit Wikipedia editions and provides metrics for “pages” and “edits” for all Wikipedia editions (320 languages). The “pages” metric counts articles and non-articles, while the “edits” metric tallies edits on articles and non-articles, all categorized by contributor type: humans or bots. The metadata is downloaded from [Wikimedia Statistics](https://stats.wikimedia.org/#/all-projects), then processed and uploaded to the Hugging Face Hub as a dataset. For more details about the dataset, please **read** and **cite** our paper: ```bash @inproceedings{alshahrani-etal-2023-performance, title = "{Performance Implications of Using Unrepresentative Corpora in {A}rabic Natural Language Processing}", author = "Alshahrani, Saied and Alshahrani, Norah and Dey, Soumyabrata and Matthews, Jeanna", booktitle = "Proceedings of the The First Arabic Natural Language Processing Conference (ArabicNLP 2023)", month = December, year = "2023", address = "Singapore (Hybrid)", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2023.arabicnlp-1.19", doi = "10.18653/v1/2023.arabicnlp-1.19", pages = "218--231", abstract = "Wikipedia articles are a widely used source of training data for Natural Language Processing (NLP) research, particularly as corpora for low-resource languages like Arabic. However, it is essential to understand the extent to which these corpora reflect the representative contributions of native speakers, especially when many entries in a given language are directly translated from other languages or automatically generated through automated mechanisms. 
In this paper, we study the performance implications of using inorganic corpora that are not representative of native speakers and are generated through automated techniques such as bot generation or automated template-based translation. The case of the Arabic Wikipedia editions gives a unique case study of this since the Moroccan Arabic Wikipedia edition (ARY) is small but representative, the Egyptian Arabic Wikipedia edition (ARZ) is large but unrepresentative, and the Modern Standard Arabic Wikipedia edition (AR) is both large and more representative. We intrinsically evaluate the performance of two main NLP upstream tasks, namely word representation and language modeling, using word analogy evaluations and fill-mask evaluations using our two newly created datasets: Arab States Analogy Dataset (ASAD) and Masked Arab States Dataset (MASD). We demonstrate that for good NLP performance, we need both large and organic corpora; neither alone is sufficient. We show that producing large corpora through automated means can be a counter-productive, producing models that both perform worse and lack cultural richness and meaningful representation of the Arabic language and its native speakers.", } ```
karol-skorulski/controlnet-apparel-dataset
--- dataset_info: features: - name: image dtype: image - name: conditioning_image dtype: image - name: text dtype: string splits: - name: train num_bytes: 149913420.0 num_examples: 1000 download_size: 149765339 dataset_size: 149913420.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
irds/clueweb12_b13_ntcir-www-3
--- pretty_name: '`clueweb12/b13/ntcir-www-3`' viewer: false source_datasets: ['irds/clueweb12_b13'] task_categories: - text-retrieval --- # Dataset Card for `clueweb12/b13/ntcir-www-3` The `clueweb12/b13/ntcir-www-3` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/clueweb12#clueweb12/b13/ntcir-www-3). # Data This dataset provides: - `queries` (i.e., topics); count=160 - For `docs`, use [`irds/clueweb12_b13`](https://huggingface.co/datasets/irds/clueweb12_b13) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/clueweb12_b13_ntcir-www-3', 'queries') for record in queries: record # {'query_id': ..., 'title': ..., 'description': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format.
baruga/pxlgrl
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 30516207.0 num_examples: 21 download_size: 30518111 dataset_size: 30516207.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
BangumiBase/xxxholic
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Xxxholic This is the image base of bangumi xxxHOLiC; we detected 36 characters and 3967 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is a preview of the characters: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 2265 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 70 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 20 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 189 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview
3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 20 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 23 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 11 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 16 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 27 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 9 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 59 | [Download](10/dataset.zip) | 
![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 94 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 20 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 67 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 33 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 48 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 543 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | 
![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 29 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 66 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 16 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 26 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 11 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 30 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 29 | [Download](23/dataset.zip) | ![preview 
1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 31 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 9 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 12 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 7 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | N/A | | 28 | 6 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | N/A | N/A | | 29 | 8 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | 
| 30 | 39 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 23 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 7 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | N/A | | 33 | 14 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 21 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) | | noise | 69 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
hackathon-somos-nlp-2023/alpaca-es-auto-filter
--- dataset_info: features: - name: text dtype: 'null' - name: inputs struct: - name: 1-instruction dtype: string - name: 2-input dtype: string - name: 3-output dtype: string - name: prediction dtype: 'null' - name: prediction_agent dtype: 'null' - name: annotation dtype: string - name: annotation_agent dtype: string - name: vectors struct: - name: input sequence: float64 - name: instruction sequence: float64 - name: output sequence: float64 - name: multi_label dtype: bool - name: explanation dtype: 'null' - name: id dtype: string - name: metadata struct: - name: bias_score.label dtype: string - name: bias_score.score dtype: float64 - name: en_index dtype: int64 - name: hate_score.label dtype: string - name: hate_score.score dtype: float64 - name: sf-multi-unprocessable-score dtype: float64 - name: sf-unprocessable-score dtype: float64 - name: tr-flag-1-instruction dtype: bool - name: tr-flag-2-input dtype: bool - name: tr-flag-3-output dtype: bool - name: status dtype: string - name: event_timestamp dtype: timestamp[us] - name: metrics struct: - name: text_length dtype: int64 splits: - name: train num_bytes: 986677188 num_examples: 51942 download_size: 653488377 dataset_size: 986677188 --- # Dataset Card for "alpaca-es-hackaton" The original database was provided by SomosNLP and can be found [here](https://huggingface.co/datasets/somosnlp/somos-clean-alpaca-es). This dataset is a translation of the Clean Alpaca dataset into Spanish and serves as the reference for the collaborative effort to clean and improve the dataset during the [Hackathon Somos NLP 2023](https://somosnlp.org/hackathon).
*Note: It is not necessary to take part in the hackathon to contribute to this task.* The scripts, models and, in general, the code associated with these tasks can be found in the GitHub repository of [Burra](https://github.com/maxserras/burra). For this challenge, we manually annotated a number of examples and semi-automatically analyzed the dataset to identify certain inconsistencies: 1- Mistranslated instructions: examples whose instruction is identical in the [English](https://github.com/maxserras/burra/blob/master/corpus/alpaca_data_cleaned.json) corpus and in the Spanish one were automatically labeled "BAD INSTRUCTION". 2- Identification of mistranslated examples using LangID, flagged at the metadata level: - tr-flag-1-instruction: True if the instruction is mistranslated - tr-flag-2-input: True if the input is mistranslated - tr-flag-3-output: True if the output is mistranslated 3- Evaluation of two SetFit models for detecting unprocessable examples such as URLs, photographs, images, and any other element the model has no capacity to assimilate: - sf-multi-unprocessable-score: float - score from the [model](https://huggingface.co/hackathon-somos-nlp-2023/setfit-alpaca-es-unprocessable-sample-detection-multi) trained on a multilingual base. - sf-unprocessable-score: float - score from the [model](https://huggingface.co/hackathon-somos-nlp-2023/setfit-alpaca-es-unprocessable-sample-detection) without a multilingual base. 4- Alignment of the EN and ES corpora at the translation level using [LASER](https://github.com/facebookresearch/LASER). Not all elements could be aligned, but in general, when alignment fails the translation or the example tends to contain errors, so we recommend discarding those examples.
- en_index: int, the metadata field holding the index into the [initial English corpus](https://github.com/maxserras/burra/blob/master/corpus/alpaca_data_cleaned.json). 5- Analysis of the examples with pre-trained [Bias Detection](https://huggingface.co/d4data/bias-detection-model) and [Hate Speech Detection](https://huggingface.co/Hate-speech-CNERG/bert-base-uncased-hatexplain) models, with the results stored in the metadata fields: - hate_score.label, hate_score.score - bias_score.label, bias_score.score
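As an illustration, the metadata flags described above (`tr-flag-*`, `sf-unprocessable-score`, `en_index`) can be combined into a simple quality filter. This is only a sketch over plain Python records using the field names from this card; the 0.5 threshold is a hypothetical choice, not one used by the authors.

```python
def keep_example(metadata, max_unprocessable=0.5):
    """Return True if an example passes the quality checks described in this card.

    `metadata` is a dict using this dataset's metadata field names; the
    `max_unprocessable` threshold is hypothetical.
    """
    # Drop examples flagged as mistranslated by LangID.
    if metadata.get("tr-flag-1-instruction") or metadata.get("tr-flag-2-input") \
            or metadata.get("tr-flag-3-output"):
        return False
    # Drop examples LASER could not align with the English corpus.
    if metadata.get("en_index", -1) < 0:
        return False
    # Drop examples the SetFit model scores as likely unprocessable.
    if metadata.get("sf-unprocessable-score", 0.0) > max_unprocessable:
        return False
    return True

records = [
    {"tr-flag-1-instruction": False, "tr-flag-2-input": False,
     "tr-flag-3-output": False, "en_index": 42, "sf-unprocessable-score": 0.1},
    {"tr-flag-1-instruction": True, "tr-flag-2-input": False,
     "tr-flag-3-output": False, "en_index": 7, "sf-unprocessable-score": 0.1},
]
kept = [r for r in records if keep_example(r)]
```

The same predicate could presumably be applied to the loaded dataset's `metadata` column via `datasets`' `filter`.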
ygong/BloomVQA
--- task_categories: - visual-question-answering language: - en size_categories: - 1K<n<10K --- # Dataset Card for BloomVQA ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> BloomVQA is a dataset based on picture stories designed for educating young children. It aims to facilitate comprehensive evaluation and characterization of vision-language models on comprehension tasks. The dataset contains tasks reflecting 6 different levels of comprehension and underlying cognitive processes, as laid out in Bloom's Taxonomy, a classic framework widely adopted in education research. This underlying hierarchical taxonomy enables graded model evaluation, automatic data augmentation and novel metrics characterizing model consistency. The core dataset contains 1200 multiple-choice samples collected via Amazon Mechanical Turk based on 20 picture stories downloaded from Creative Commons resources [Book Dash](https://bookdash.org/) and [Storyweaver](https://storyweaver.org.in/en/). <!-- Provide the basic links for the dataset. --> - **Paper:** [BloomVQA: Assessing Hierarchical Multi-modal Comprehension](https://arxiv.org/abs/2312.12716) ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> Each multiple-choice sample contains 1 question and 4 free-form answers including 1 correct answer and 3 incorrect answers. Each sample is labeled with the title of picture story and the level of comprehension as defined in Bloom's Taxonomy.
TrainingDataPro/facial_keypoint_detection
--- license: cc-by-nc-nd-4.0 task_categories: - image-classification language: - en tags: - code - finance dataset_info: features: - name: image_id dtype: uint32 - name: image dtype: image - name: mask dtype: image - name: key_points dtype: string splits: - name: train num_bytes: 134736982 num_examples: 15 download_size: 129724970 dataset_size: 134736982 --- # Facial Keypoints The dataset is designed for computer vision and machine learning tasks involving the identification and analysis of key points on a human face. It consists of images of human faces, each accompanied by key point annotations in XML format. # Get the dataset ### This is just an example of the data Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/facial-keypoints-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=facial_keypoint_detection) to discuss your requirements, learn about the price and buy the dataset. ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2F3d7bd72ae7143ee767c2ec54aabde499%2Fimage_keypoint.png?generation=1683577579318981&alt=media) # Data Format Each image from `FKP` folder is accompanied by an XML-annotation in the `annotations.xml` file indicating the coordinates of the key points. For each point, the x and y coordinates are provided, and there is a `Presumed_Location` attribute, indicating whether the point is presumed or accurately defined. 
# Example of XML file structure ![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2Fb68d405e08a0d5dc6e5c87476758164d%2Fcarbon.png?generation=1684338809077422&alt=media) # Labeled Keypoints **1.** Left eye, the closest point to the nose **2.** Left eye, pupil's center **3.** Left eye, the closest point to the left ear **4.** Right eye, the closest point to the nose **5.** Right eye, pupil's center **6.** Right eye, the closest point to the right ear **7.** Left eyebrow, the closest point to the nose **8.** Left eyebrow, the closest point to the left ear **9.** Right eyebrow, the closest point to the nose **10.** Right eyebrow, the closest point to the right ear **11.** Nose, center **12.** Mouth, left corner point **13.** Mouth, right corner point **14.** Mouth, the highest point in the middle **15.** Mouth, the lowest point in the middle # Keypoint annotation is made in accordance with your requirements. ## [**TrainingData**](https://trainingdata.pro/data-market/facial-keypoints-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=facial_keypoint_detection) provides high-quality data annotation tailored to your needs More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets** TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
Mitsuki-Sakamoto/fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_1.0_seed_3_t_1.0_eval
--- dataset_info: config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: preference dtype: int64 - name: output_1 dtype: string - name: output_2 dtype: string - name: reward_model_prompt_format dtype: string - name: gen_prompt_format dtype: string - name: gen_kwargs struct: - name: do_sample dtype: bool - name: max_new_tokens dtype: int64 - name: pad_token_id dtype: int64 - name: top_k dtype: int64 - name: top_p dtype: float64 - name: reward_1 dtype: float64 - name: reward_2 dtype: float64 - name: n_samples dtype: int64 - name: reject_select dtype: string - name: index dtype: int64 - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string - name: filtered_epoch dtype: int64 - name: gen_reward dtype: float64 - name: gen_response dtype: string - name: gen_proxy_reward dtype: float64 - name: gen_gold_reward dtype: float64 splits: - name: epoch_0 num_bytes: 44049617 num_examples: 18928 - name: epoch_1 num_bytes: 44641770 num_examples: 18928 - name: epoch_2 num_bytes: 44729466 num_examples: 18928 - name: epoch_3 num_bytes: 44803174 num_examples: 18928 - name: epoch_4 num_bytes: 44843154 num_examples: 18928 - name: epoch_5 num_bytes: 44847644 num_examples: 18928 - name: epoch_6 num_bytes: 44861415 num_examples: 18928 - name: epoch_7 num_bytes: 44849856 num_examples: 18928 - name: epoch_8 num_bytes: 44847396 num_examples: 18928 - name: epoch_9 num_bytes: 44851655 num_examples: 18928 - name: epoch_10 num_bytes: 44847136 num_examples: 18928 - name: epoch_11 num_bytes: 44846823 num_examples: 18928 - name: epoch_12 num_bytes: 44851065 num_examples: 18928 - name: epoch_13 num_bytes: 44847863 num_examples: 18928 - name: epoch_14 num_bytes: 44850967 num_examples: 18928 - name: epoch_15 num_bytes: 44849911 num_examples: 18928 - name: epoch_16 num_bytes: 44848297 num_examples: 18928 - name: epoch_17 
num_bytes: 44849929 num_examples: 18928 - name: epoch_18 num_bytes: 44851835 num_examples: 18928 - name: epoch_19 num_bytes: 44851924 num_examples: 18928 - name: epoch_20 num_bytes: 44851701 num_examples: 18928 - name: epoch_21 num_bytes: 44852533 num_examples: 18928 - name: epoch_22 num_bytes: 44850424 num_examples: 18928 - name: epoch_23 num_bytes: 44851692 num_examples: 18928 - name: epoch_24 num_bytes: 44852880 num_examples: 18928 - name: epoch_25 num_bytes: 44852549 num_examples: 18928 - name: epoch_26 num_bytes: 44851982 num_examples: 18928 - name: epoch_27 num_bytes: 44853410 num_examples: 18928 - name: epoch_28 num_bytes: 44853893 num_examples: 18928 - name: epoch_29 num_bytes: 44851923 num_examples: 18928 download_size: 710344574 dataset_size: 1344343884 configs: - config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 data_files: - split: epoch_0 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-* - split: epoch_1 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-* - split: epoch_2 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-* - split: epoch_3 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-* - split: epoch_4 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-* - split: epoch_5 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-* - split: epoch_6 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-* - split: epoch_7 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-* - split: epoch_8 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-* - split: epoch_9 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-* - split: epoch_10 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-* - split: epoch_11 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-* - split: epoch_12 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-* - split: epoch_13 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-* - split: epoch_14 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-* - split: epoch_15 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-* - split: epoch_16 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-* - split: epoch_17 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-* - split: epoch_18 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-* - split: epoch_19 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-* - split: epoch_20 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-* - split: epoch_21 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-* - split: epoch_22 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-* - split: epoch_23 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-* - split: epoch_24 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-* - split: epoch_25 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-* - split: epoch_26 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-* - split: epoch_27 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-* - split: epoch_28 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-* - split: epoch_29 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-* --- # Dataset Card for "fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_1.0_seed_3_t_1.0_eval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jeongah/curse-detection-data
--- dataset_info: features: - name: index dtype: int64 - name: document dtype: string - name: ' label' dtype: int64 splits: - name: train num_bytes: 429333 num_examples: 4452 - name: test num_bytes: 106670 num_examples: 1113 download_size: 364473 dataset_size: 536003 --- The dataset consists of 4,452 training sentences and 1,113 test sentences. Sentences containing profanity are labeled with a spam value of 1; sentences without profanity are labeled 0. https://github.com/2runo/Curse-detection-data --- dataset_info: features: - name: index dtype: int64 - name: sentence dtype: string - name: ' spam' dtype: int64 splits: - name: train num_bytes: 429333 num_examples: 4452 - name: test num_bytes: 106670 num_examples: 1113 download_size: 364457 dataset_size: 536003 --- # Dataset Card for "curse-detection-data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
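Note that the label column name in the metadata above carries a leading space (`' label'` / `' spam'`). If that is also true of the loaded records, it may be worth normalizing the keys before use; a minimal sketch over plain dicts:

```python
def normalize_keys(record):
    """Strip stray whitespace from column names (e.g. ' label' -> 'label')."""
    return {key.strip(): value for key, value in record.items()}

# Toy row shaped like this dataset's metadata, with the oddly named ' label' key.
row = {"index": 0, "document": "example sentence", " label": 1}
clean = normalize_keys(row)
```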
mask-distilled-one-sec-cv12/chunk_88
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1326659496 num_examples: 260538 download_size: 1352975176 dataset_size: 1326659496 --- # Dataset Card for "chunk_88" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-staging-eval-project-5480d71b-7995085
--- type: predictions tags: - autotrain - evaluation datasets: - cifar10 eval_info: task: image_multi_class_classification model: karthiksv/vit-base-patch16-224-cifar10 metrics: [] dataset_name: cifar10 dataset_config: plain_text dataset_split: test col_mapping: image: img target: label --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Multi-class Image Classification * Model: karthiksv/vit-base-patch16-224-cifar10 * Dataset: cifar10 To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
autoevaluate/autoeval-staging-eval-project-50a56796-db2d-4349-bf75-388efb52b967-3634
--- type: predictions tags: - autotrain - evaluation datasets: - glue eval_info: task: binary_classification model: autoevaluate/binary-classification metrics: ['matthews_correlation'] dataset_name: glue dataset_config: sst2 dataset_split: validation col_mapping: text: sentence target: label --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Binary Text Classification * Model: autoevaluate/binary-classification * Dataset: glue * Config: sst2 * Split: validation To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_95
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1249082876.0 num_examples: 245303 download_size: 1275054606 dataset_size: 1249082876.0 --- # Dataset Card for "chunk_95" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hanifsyarubany10/LaMini-sample-mistral
--- dataset_info: features: - name: context dtype: string - name: instruction dtype: string - name: response dtype: string - name: instruction_source dtype: string - name: prompt dtype: string - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 1229906208 num_examples: 482055 download_size: 472588315 dataset_size: 1229906208 configs: - config_name: default data_files: - split: train path: data/train-* ---
WorkWithData/Country_Covid19_Daily_2020
--- license: cc-by-4.0 --- This dataset shows daily cases and deaths from Covid-19 by country in 2020. The dataset can also be found on: https://www.workwithdata.com/dataset?entity=covid_country_daily&f=1&fcol0=date&fop0=includes&fval0=2020 Similar datasets can be found on: https://www.workwithdata.com
dangrebenkin/voxforge-ru-dataset
--- dataset_info: features: - name: transcription dtype: string - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 1947609729.4653895 num_examples: 6169 - name: test num_bytes: 864278563.4406104 num_examples: 2645 download_size: 2705520657 dataset_size: 2811888292.906 license: apache-2.0 --- ## Dataset audio info - 16000 Hz 16 bit - wav - mono - Russian speech ## Dataset instance structure {'audio': {'path': '/path/to/wav.wav', 'array': array([wav numpy array], dtype=float32), 'sampling_rate': 16000}, 'transcription': 'транскрипция'} ## Citation @Misc{Voxforge.org, author = {Voxforge.org}, title = {Free Speech... Recognition (Linux, Windows and Mac) - voxforge.org}, howpublished = {\url{http://www.voxforge.org/}}, note = {accessed 01/21/2023} } ## Source http://www.voxforge.org/ru/downloads
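Given the instance structure above, a clip's duration follows from the array length and the 16 kHz sampling rate. A minimal sketch, using a plain Python list in place of the decoded NumPy array:

```python
def clip_duration_seconds(instance):
    """Duration of one dataset instance, from its audio samples and sampling rate."""
    audio = instance["audio"]
    return len(audio["array"]) / audio["sampling_rate"]

# Toy instance shaped like the card's example (a real instance holds a NumPy
# float32 array decoded from the 16 kHz mono WAV file).
example = {
    "audio": {"path": "/path/to/wav.wav", "array": [0.0] * 32000, "sampling_rate": 16000},
    "transcription": "транскрипция",
}
duration = clip_duration_seconds(example)  # 32000 samples at 16 kHz -> 2.0 s
```

Because real instances expose `array` as a NumPy array with a `len()`, the same expression should work unchanged on the loaded dataset.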
vhug/xorder_dish
--- dataset_info: features: - name: pixel_values dtype: image - name: label dtype: image splits: - name: train num_bytes: 495377032.0 num_examples: 205 download_size: 0 dataset_size: 495377032.0 --- # Dataset Card for "xorder_dish" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
galsenai/waxal_dataset
--- license: creativeml-openrail-m dataset_info: features: - name: audio dtype: audio - name: label dtype: class_label: names: '0': A canoon '1': A cinj '2': A keen '3': A lanq '4': A ñaaƴ '5': A ñamaak '6': Alaa '7': Bacaac '8': Benn '9': Bálamuk '10': Búbaar '11': Caggal '12': Ceme '13': Ci ginnaaw '14': Ci kanam '15': Ci kow '16': Ci suuf '17': Càmmoñ '18': Darnde '19': Dow '20': Doxal '21': Déedet '22': Eey '23': Esuwa '24': Eyen '25': Eé '26': Fatiya '27': Fukk '28': Funoom '29': Futok '30': Futok di sibaakiir '31': Futok di sigaba '32': Futok di sífeejir '33': Futok di yákon '34': Fácul '35': Garab '36': Goo '37': Hani '38': Jaay '39': Jeegom '40': Jeenay '41': Jeetati '42': Jeeɗiɗi '43': Jik '44': Jiku '45': Joy '46': Juni '47': Junne '48': Juroom '49': Juroom-benn '50': Juroom-ñaar '51': Juroom-ñeent '52': Juroom-ñett '53': Jënd '54': Kakamben '55': Kamay '56': Kanoomen '57': Kákambul '58': Kárir '59': Lal '60': Lees '61': Leng '62': Leɗki '63': Li '64': Mbaamir '65': Mbalndi '66': Nano '67': Naxik '68': Nay '69': Ndaxar '70': Ndeyjoor '71': Ndiga '72': Ndiiƭ '73': Njong '74': O ɓox '75': Picc '76': Rawaandu '77': Sappo '78': Sibaakiir '79': Sigaba '80': Solndu '81': Soodde '82': Sífeejir '83': Tadik '84': Tati '85': Taxawal '86': Teemedere '87': Teemeed '88': Tentaam '89': Tik '90': Took '91': Tus '92': Téemeer '93': Ub /Tëj '94': Ub/Tëj '95': Ubbi /Tijji '96': Udditde '97': Uddude '98': Ujaw '99': Ujunere '100': Ujuum '101': Uñen '102': Waafulet '103': Waaw '104': Weg '105': Wet '106': Wúli '107': Xa-aa '108': Xaj '109': Xarɓaxay '110': Yahdu '111': Yeeso '112': Yeeyde '113': Yákon '114': Ñaamo '115': Ñaar '116': Ñeent '117': Ñett '118': Ɗiɗi '119': Ƥetaa-fo-leng '120': Ƥetaa-naxak '121': Ƥetaa-tadak '122': Ƥetaa-ƭaq '123': Ƥetik - name: translation dtype: string - name: locale_id dtype: int64 - name: transcript dtype: string splits: - name: train num_bytes: 567773923.639 num_examples: 26387 download_size: 546144081 dataset_size: 567773923.639 --- ### 
Dataset Summary Keyword spotting is the task of learning to detect spoken keywords. It powers the interface of modern voice-based virtual assistants on the market: Amazon's Alexa, Apple's Siri, and the Google Home device. Unlike speech recognition models, keyword spotting does not run in the cloud but directly on the device. The motivation of this dataset is to extend the Speech Commands dataset (Warden 2018) with African languages. In particular, we focus on four Senegalese languages: Wolof, Pulaar, Serer, and Diola. The choice of these languages is guided, on the one hand, by their status as first-generation languages, that is, the first codified languages (endowed with a writing system and recognized as national languages by the state of Senegal) under decree n° 68-871 of July 24, 1968. On the other hand, they are among the most widely spoken languages in Senegal. ### Languages The IDs of the languages are the following: - Wolof: `7` - Pulaar: `5` - Serer: `6` - Diola: `3` ## Dataset Structure ```python from datasets import load_dataset dataset = load_dataset("galsenai/waxal_dataset") DatasetDict({ train: Dataset({ features: ['audio', 'label', 'translation', 'locale_id'], num_rows: 26387 }) }) ``` ### Data Fields - `audio`: Audio file in MP3 format - `label`: label of the audio file - `translation`: translation of the keyword in French - `locale_id`: ID of the language
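The locale IDs listed above can be used to select a single language. The sketch below mimics that selection with plain Python records so it stays self-contained; the record shape follows the data fields of this card, but the sample values are made up.

```python
# Language-to-ID mapping taken from this card.
LOCALE_IDS = {"Diola": 3, "Pulaar": 5, "Serer": 6, "Wolof": 7}

def select_language(records, language):
    """Keep only the records spoken in `language` (one of the four above)."""
    locale_id = LOCALE_IDS[language]
    return [r for r in records if r["locale_id"] == locale_id]

# Toy records shaped like this dataset's fields (values are illustrative only).
records = [
    {"label": 103, "translation": "oui", "locale_id": 7},  # Wolof
    {"label": 22, "translation": "oui", "locale_id": 5},   # Pulaar
]
wolof_only = select_language(records, "Wolof")
```

On the loaded dataset, the equivalent would presumably be `dataset["train"].filter(lambda x: x["locale_id"] == 7)`.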
VASVASVAS/models
--- license: openrail ---
yhavinga/cnn_dailymail_dutch
--- annotations_creators: - no-annotation language_creators: - found language: - nl license: - apache-2.0 multilinguality: - monolingual size_categories: - 100K<n<1M source_datasets: - original task_categories: - summarization task_ids: - news-articles-summarization paperswithcode_id: cnn-daily-mail-1 pretty_name: CNN / Daily Mail train-eval-index: - config: 3.0.0 task: summarization task_id: summarization splits: eval_split: test col_mapping: article: text highlights: target --- # Dataset Card for CNN Dailymail Dutch 🇳🇱🇧🇪 Dataset ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description Note: the data below is from the English version at [cnn_dailymail](https://huggingface.co/datasets/cnn_dailymail). 
- **Homepage:** - **Repository:** [CNN / DailyMail Dataset repository](https://github.com/abisee/cnn-dailymail) - **Paper:** [Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond](https://papers.nips.cc/paper/5945-teaching-machines-to-read-and-comprehend.pdf), [Get To The Point: Summarization with Pointer-Generator Networks](https://www.aclweb.org/anthology/K16-1028.pdf) - **Leaderboard:** [Papers with Code leaderboard for CNN / Dailymail Dataset](https://paperswithcode.com/sota/document-summarization-on-cnn-daily-mail) - **Point of Contact:** [Abigail See](mailto:abisee@stanford.edu) ### Dataset Summary The CNN / DailyMail Dutch 🇳🇱🇧🇪 Dataset is an English-language dataset translated to Dutch containing just over 300k unique news articles as written by journalists at CNN and the Daily Mail. The current version supports both extractive and abstractive summarization, though the original version was created for machine reading and comprehension and abstractive question answering. *This dataset currently (Aug '22) has a single config, which is config `3.0.0` of [cnn_dailymail](https://huggingface.co/datasets/cnn_dailymail) translated to Dutch with [yhavinga/t5-base-36L-ccmatrix-multi](https://huggingface.co/yhavinga/t5-base-36L-ccmatrix-multi).* ### Supported Tasks and Leaderboards - 'summarization': [Version 3.0.0 of the CNN / DailyMail Dataset](https://www.aclweb.org/anthology/K16-1028.pdf) can be used to train a model for abstractive and extractive summarization ([Version 1.0.0](https://papers.nips.cc/paper/5945-teaching-machines-to-read-and-comprehend.pdf) was developed for machine reading and comprehension and abstractive question answering). The model performance is measured by how high the output summary's [ROUGE](https://huggingface.co/metrics/rouge) score for a given article is when compared to the highlight as written by the original article author. 
[Zhong et al (2020)](https://www.aclweb.org/anthology/2020.acl-main.552.pdf) report a ROUGE-1 score of 44.41 when testing a model trained for extractive summarization. See the [Papers With Code leaderboard](https://paperswithcode.com/sota/document-summarization-on-cnn-daily-mail) for more models.

### Languages

This dataset is in Dutch (BCP-47: `nl`), machine-translated from the English source articles. For the original English data, the BCP-47 code for English as generally spoken in the United States is en-US and the BCP-47 code for English as generally spoken in the United Kingdom is en-GB. It is unknown if other varieties of English are represented in the data.

## Dataset Structure

### Data Instances

For each instance, there is a string for the article, a string for the highlights, and a string for the id. See the [CNN / Daily Mail dataset viewer](https://huggingface.co/datasets/viewer/?dataset=cnn_dailymail&config=3.0.0) to explore more examples.

```
{'id': '0054d6d30dbcad772e20b22771153a2a9cbeaf62',
 'article': '(CNN) -- An American woman died aboard a cruise ship that docked at Rio de Janeiro on Tuesday, the same ship on which 86 passengers previously fell ill, according to the state-run Brazilian news agency, Agencia Brasil. The American tourist died aboard the MS Veendam, owned by cruise operator Holland America. Federal Police told Agencia Brasil that forensic doctors were investigating her death. The ship's doctors told police that the woman was elderly and suffered from diabetes and hypertension, according the agency. The other passengers came down with diarrhea prior to her death during an earlier part of the trip, the ship's doctors said. The Veendam left New York 36 days ago for a South America tour.'
 'highlights': 'The elderly woman suffered from diabetes and hypertension, ship's doctors say .\nPreviously, 86 passengers had fallen ill on the ship, Agencia Brasil says .'}
```

The average token counts for the articles and the highlights are provided below:

| Feature | Mean Token Count |
| ---------- | ---------------- |
| Article | 781 |
| Highlights | 56 |

### Data Fields

- `id`: a string containing the hexadecimal-formatted SHA1 hash of the URL where the story was retrieved from
- `article`: a string containing the body of the news article
- `highlights`: a string containing the highlight of the article as written by the article author

### Data Splits

The CNN/DailyMail dataset has 3 splits: _train_, _validation_, and _test_. Below are the statistics for Version 3.0.0 of the dataset.

| Dataset Split | Number of Instances in Split |
| ------------- | ---------------------------- |
| Train | 287,113 |
| Validation | 13,368 |
| Test | 11,490 |

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

The data consists of news articles and highlight sentences. In the question answering setting of the data, the articles are used as the context and entities are hidden one at a time in the highlight sentences, producing Cloze style questions where the goal of the model is to correctly guess which entity in the context has been hidden in the highlight. In the summarization setting, the highlight sentences are concatenated to form a summary of the article. The CNN articles were written between April 2007 and April 2015. The Daily Mail articles were written between June 2010 and April 2015.

The code for the original data collection is available at <https://github.com/deepmind/rc-data>. The articles were downloaded using archives of <www.cnn.com> and <www.dailymail.co.uk> on the Wayback Machine. Articles were not included in the Version 1.0.0 collection if they exceeded 2000 tokens.
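The entity-hiding step described above can be sketched as follows. The `make_cloze` helper and the `@placeholder` token are illustrative only, not the original collection scripts' exact format:

```python
def make_cloze(highlight: str, entity: str, placeholder: str = "@placeholder"):
    """Hide one entity in a highlight sentence, yielding a Cloze question/answer pair."""
    question = highlight.replace(entity, placeholder)
    return question, entity


# example using a highlight from the instance shown earlier
q, a = make_cloze("86 passengers had fallen ill on the ship, Agencia Brasil says.",
                  "Agencia Brasil")
# q == "86 passengers had fallen ill on the ship, @placeholder says."
```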
Due to accessibility issues with the Wayback Machine, Kyunghyun Cho has made the datasets available at <https://cs.nyu.edu/~kcho/DMQA/>. An updated version of the code that does not anonymize the data is available at <https://github.com/abisee/cnn-dailymail>.

Hermann et al. provided their own tokenization script. The script provided by See uses the PTBTokenizer. It also lowercases the text and adds periods to lines missing them.

#### Who are the source language producers?

The text was written by journalists at CNN and the Daily Mail.

### Annotations

The dataset does not contain any additional annotations.

#### Annotation process

[N/A]

#### Who are the annotators?

[N/A]

### Personal and Sensitive Information

Version 3.0 is not anonymized, so individuals' names can be found in the dataset. Information about the original author is not included in the dataset.

## Considerations for Using the Data

### Social Impact of Dataset

The purpose of this dataset is to help develop models that can summarize long paragraphs of text in one or two sentences. This task is useful for efficiently presenting information given a large quantity of text. It should be made clear that any summarizations produced by models trained on this dataset are reflective of the language used in the articles, but are in fact automatically generated.

### Discussion of Biases

[Bordia and Bowman (2019)](https://www.aclweb.org/anthology/N19-3002.pdf) explore measuring gender bias and debiasing techniques in the CNN / Dailymail dataset, the Penn Treebank, and WikiText-2. They find the CNN / Dailymail dataset to have a slightly lower gender bias based on their metric compared to the other datasets, but still show evidence of gender bias when looking at words such as 'fragile'.
Because the articles were written by and for people in the US and the UK, they will likely present specifically US and UK perspectives and feature events that are considered relevant to those populations during the time that the articles were published.

### Other Known Limitations

News articles have been shown to conform to writing conventions in which important information is primarily presented in the first third of the article [(Kryściński et al, 2019)](https://www.aclweb.org/anthology/D19-1051.pdf). [Chen et al (2016)](https://www.aclweb.org/anthology/P16-1223.pdf) conducted a manual study of 100 random instances of the first version of the dataset and found 25% of the samples to be difficult even for humans to answer correctly due to ambiguity and coreference errors.

It should also be noted that machine-generated summarizations, even when extractive, may differ in truth values when compared to the original articles.

## Additional Information

### Dataset Curators

The data was originally collected by Karl Moritz Hermann, Tomáš Kočiský, Edward Grefenstette, Lasse Espeholt, Will Kay, Mustafa Suleyman, and Phil Blunsom of Google DeepMind. Tomáš Kočiský and Phil Blunsom are also affiliated with the University of Oxford. They released scripts to collect and process the data into the question answering format.

Ramesh Nallapati, Bowen Zhou, Cicero dos Santos, and Bing Xiang of IBM Watson and Çağlar Gülçehre of Université de Montréal modified Hermann et al.'s collection scripts to restore the data to a summary format. They also produced both anonymized and non-anonymized versions.

The code for the non-anonymized version is made publicly available by Abigail See of Stanford University, Peter J. Liu of Google Brain and Christopher D. Manning of Stanford University at <https://github.com/abisee/cnn-dailymail>. The work at Stanford University was supported by the DARPA DEFT Program (AFRL contract no. FA8750-13-2-0040).
### Licensing Information

The CNN / Daily Mail dataset version 1.0.0 is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).

### Citation Information

```
@inproceedings{see-etal-2017-get,
    title = "Get To The Point: Summarization with Pointer-Generator Networks",
    author = "See, Abigail and Liu, Peter J. and Manning, Christopher D.",
    booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2017",
    address = "Vancouver, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/P17-1099",
    doi = "10.18653/v1/P17-1099",
    pages = "1073--1083",
    abstract = "Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text). However, these models have two shortcomings: they are liable to reproduce factual details inaccurately, and they tend to repeat themselves. In this work we propose a novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways. First, we use a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator. Second, we use coverage to keep track of what has been summarized, which discourages repetition.
    We apply our model to the CNN / Daily Mail summarization task, outperforming the current abstractive state-of-the-art by at least 2 ROUGE points.",
}
```

```
@inproceedings{DBLP:conf/nips/HermannKGEKSB15,
    author={Karl Moritz Hermann and Tomás Kociský and Edward Grefenstette and Lasse Espeholt and Will Kay and Mustafa Suleyman and Phil Blunsom},
    title={Teaching Machines to Read and Comprehend},
    year={2015},
    cdate={1420070400000},
    pages={1693-1701},
    url={http://papers.nips.cc/paper/5945-teaching-machines-to-read-and-comprehend},
    booktitle={NIPS},
    crossref={conf/nips/2015}
}
```

### Contributions

Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@jplu](https://github.com/jplu), [@jbragg](https://github.com/jbragg), [@patrickvonplaten](https://github.com/patrickvonplaten) and [@mcmillanmajora](https://github.com/mcmillanmajora) for adding the English version of this dataset. The dataset was translated on Cloud TPU compute generously provided by Google through the [TPU Research Cloud](https://sites.research.google/trc/).
akjindal53244/200k_removed_SNI
---
license: mit
configs:
- config_name: default
  data_files:
  - split: train
    path: train_dataset.json
  - split: test
    path: eval_dataset.json
---
afoland/chapterized_PG
---
license: apache-2.0
---

467 Project Gutenberg books, mostly of older provenance (author died pre-1914), chapterized by Chapter Captor (https://arxiv.org/abs/2011.04163), then filtered so that the number of chapters is correct and the assigned chapter numbers are sequential starting at 1.

Each line is one chapter of a book. Keys are "chapter_number", "text", "title", and "metadata" (which contains "id"). The id is the Gutenberg book number. Title is often not present.
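Assuming the chapters are stored as JSON Lines (one chapter object per line, with the keys listed above — the storage format and the sample record below are assumptions for illustration), reading a record looks like:

```python
import json

# one hypothetical line, mirroring the keys described above
line = ('{"chapter_number": 1, "text": "Call me Ishmael.", '
        '"title": "CHAPTER I", "metadata": {"id": 2701}}')

chapter = json.loads(line)
book_id = chapter["metadata"]["id"]   # the Gutenberg book number
title = chapter.get("title")          # often not present in the real data
```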
Lollitor/FSMarked
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: ID
    dtype: string
  - name: INPUT
    dtype: string
  splits:
  - name: train
    num_bytes: 17636085
    num_examples: 16245
  download_size: 261423
  dataset_size: 17636085
---

# Dataset Card for "FSMarked"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/haruna_bluearchive
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of haruna/黒舘ハルナ/晴奈 (Blue Archive)

This is the dataset of haruna/黒舘ハルナ/晴奈 (Blue Archive), containing 500 images and their tags. The core tags of this character are `grey_hair, red_eyes, braid, halo, long_hair, wings, breasts, single_wing, side_braid, large_breasts, demon_wings, tail, red_halo, black_wings, eyewear_on_head`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.10 GiB | [Download](https://huggingface.co/datasets/CyberHarem/haruna_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 902.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haruna_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1384 | 1.90 GiB | [Download](https://huggingface.co/datasets/CyberHarem/haruna_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading.
If you need this, just run the following code

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/haruna_bluearchive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering result, maybe some outfits can be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, gym_shirt, gym_uniform, looking_at_viewer, official_alternate_costume, white_shirt, blush, jacket_on_shoulders, red_buruma, solo, track_jacket, smile, simple_background, closed_mouth, white_background, holding, short_sleeves |
| 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, gym_shirt,
gym_uniform, looking_at_viewer, official_alternate_costume, solo, white_shirt, candy, red_buruma, tongue_out, open_mouth, short_sleeves, jacket_on_shoulders, track_jacket, collarbone, sunglasses, white_hair, fake_wings, simple_background, smile, white_background | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1boy, 1girl, blush, hetero, nipples, penis, solo_focus, white_shirt, official_alternate_costume, pov, sex, shirt_lift, vaginal, girl_on_top, looking_at_viewer, navel, sweat, short_sleeves, bar_censor, gym_uniform, heart, squatting_cowgirl_position, buruma, cum_in_pussy, open_mouth, smile, spread_legs | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, blush, hetero, white_shirt, 1girl, solo_focus, tongue_out, bar_censor, gym_uniform, official_alternate_costume, gym_shirt, open_mouth, looking_at_viewer, pov, buruma, erection, licking_penis, pubic_hair, short_sleeves | | 4 | 21 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_headwear, black_necktie, black_skirt, collared_shirt, looking_at_viewer, solo, white_shirt, long_sleeves, black_pantyhose, smile, blush, peaked_cap, white_background, black_jacket, jacket_on_shoulders, simple_background, bow, closed_mouth, frills | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_headwear, black_necktie, black_skirt, blush, collared_shirt, hat, long_sleeves, looking_at_viewer, solo, white_shirt, black_pantyhose, holding_food, smile, candy, chocolate, frills, black_jacket, heart, 
open_mouth | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, blue_necktie, blue_shirt, long_sleeves, policewoman, black_pantyhose, blush, collared_shirt, demon_tail, looking_at_viewer, pencil_skirt, solo, white_background, white_gloves, alternate_costume, blue_skirt, grin, simple_background, white_hair, black_belt, black_footwear, blue_headwear, full_body, hair_bun, heart, police_hat, shopping_bag, handcuffs, high_heels, holding | | 7 | 11 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, fur-trimmed_kimono, hair_flower, long_sleeves, looking_at_viewer, official_alternate_costume, solo, obi, single_hair_bun, smile, wide_sleeves, blue_flower, blush, floral_print, simple_background, white_background, fur_collar, purple_kimono, closed_mouth | | 8 | 11 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, cleavage, collarbone, official_alternate_costume, white_one-piece_swimsuit, bare_shoulders, blush, casual_one-piece_swimsuit, looking_at_viewer, smile, solo, closed_mouth, covered_navel, simple_background, holding, white_hair, white_background, demon_tail, food, single_hair_bun, thighs, water | | 9 | 12 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, fake_animal_ears, playboy_bunny, solo, strapless_leotard, alternate_costume, black_pantyhose, looking_at_viewer, rabbit_ears, bare_shoulders, detached_collar, simple_background, white_background, smile, black_leotard, bowtie, cleavage, wrist_cuffs, blush, covered_navel, fake_tail, highleg, open_mouth, 
rabbit_tail | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gym_shirt | gym_uniform | looking_at_viewer | official_alternate_costume | white_shirt | blush | jacket_on_shoulders | red_buruma | solo | track_jacket | smile | simple_background | closed_mouth | white_background | holding | short_sleeves | candy | tongue_out | open_mouth | collarbone | sunglasses | white_hair | fake_wings | 1boy | hetero | nipples | penis | solo_focus | pov | sex | shirt_lift | vaginal | girl_on_top | navel | sweat | bar_censor | heart | squatting_cowgirl_position | buruma | cum_in_pussy | spread_legs | erection | licking_penis | pubic_hair | black_headwear | black_necktie | black_skirt | collared_shirt | long_sleeves | black_pantyhose | peaked_cap | black_jacket | bow | frills | hat | holding_food | chocolate | blue_necktie | blue_shirt | policewoman | demon_tail | pencil_skirt | white_gloves | alternate_costume | blue_skirt | grin | black_belt | black_footwear | blue_headwear | full_body | hair_bun | police_hat | shopping_bag | handcuffs | high_heels | fur-trimmed_kimono | hair_flower | obi | single_hair_bun | wide_sleeves | blue_flower | floral_print | fur_collar | purple_kimono | cleavage | white_one-piece_swimsuit | bare_shoulders | casual_one-piece_swimsuit | covered_navel | food | thighs | water | fake_animal_ears | playboy_bunny | strapless_leotard | rabbit_ears | detached_collar | black_leotard | bowtie | wrist_cuffs | fake_tail | highleg | rabbit_tail | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:--------------|:--------------------|:-----------------------------|:--------------|:--------|:----------------------|:-------------|:-------|:---------------|:--------|:--------------------|:---------------|:-------------------|:----------|:----------------|:--------|:-------------|:-------------|:-------------|:-------------|:-------------|:-------------|:-------|:---------|:----------|:--------|:-------------|:------|:------|:-------------|:----------|:--------------|:--------|:--------|:-------------|:--------|:-----------------------------|:---------|:---------------|:--------------|:-----------|:----------------|:-------------|:-----------------|:----------------|:--------------|:-----------------|:---------------|:------------------|:-------------|:---------------|:------|:---------|:------|:---------------|:------------|:---------------|:-------------|:--------------|:-------------|:---------------|:---------------|:--------------------|:-------------|:-------|:-------------|:-----------------|:----------------|:------------|:-----------|:-------------|:---------------|:------------|:-------------|:---------------------|:--------------|:------|:------------------|:---------------|:--------------|:---------------|:-------------|:----------------|:-----------|:---------------------------|:-----------------|:----------------------------|:----------------|:-------|:---------|:--------|:-------------------|:----------------|:--------------------|:--------------|:------------------|:----------------|:---------|:--------------|:------------|:----------|:--------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | 
X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | X | X | X | | | | | X | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | | | | | | | | | | X | | X | X | | | | | X | X | | | X | X | | | | | | | X | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 21 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | 
![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | | X | X | | | X | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | | | X | | | X | | | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 11 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | X | | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 8 | 11 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | X | | X | | | X | | X | X | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | 9 | 12 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | | | X | | | X | | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X |
MichiganNLP/scalable_vlm_probing
---
license: cc-by-4.0
language:
- en
pretty_name: Scalable VLM Probing
---

# Scalable VLM Probing

This repository contains data that supports the codebase of the [Scalable VLM Probing project](https://github.com/MichiganNLP/Scalable-VLM-Probing).

Note: the embeddings in this repository are currently unused and come from preliminary experiments.
JayalekshmiGopakumar/doclaynet_classlabel
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: label
    dtype:
      class_label:
        names:
          '0': financial_reports
          '1': government_tenders
          '2': laws_and_regulations
          '3': manuals
          '4': patents
          '5': scientific_articles
  splits:
  - name: train
    num_bytes: 1798548
    num_examples: 691
  - name: validation
    num_bytes: 166488
    num_examples: 64
  - name: test
    num_bytes: 124710
    num_examples: 49
  download_size: 1173005
  dataset_size: 2089746
---

# Dataset Card for "doclaynet_classlabel"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
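The integer labels decode to category names per the `class_label` mapping declared above; a minimal sketch (the `id2label` helper is illustrative, not part of the dataset):

```python
# label id -> name mapping, as declared in the dataset_info above
LABEL_NAMES = [
    "financial_reports",
    "government_tenders",
    "laws_and_regulations",
    "manuals",
    "patents",
    "scientific_articles",
]


def id2label(label_id: int) -> str:
    """Map an integer class label to its document-category name."""
    return LABEL_NAMES[label_id]
```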