| column | type | min | max |
|---|---|---|---|
| id | string (length) | 2 | 115 |
| lastModified | string (length) | 24 | 24 |
| tags | list | | |
| author | string (length) | 2 | 42 |
| description | string (length) | 0 | 6.67k |
| citation | string (length) | 0 | 10.7k |
| likes | int64 | 0 | 3.66k |
| downloads | int64 | 0 | 8.89M |
| created | timestamp[us] | | |
| card | string (length) | 11 | 977k |
| card_len | int64 | 11 | 977k |
| embeddings | list | | |
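The schema above describes one record per dataset repository. A minimal sketch of a single record as a plain Python dict, using the column names from the schema; the sample values below are illustrative placeholders (modeled on the rows that follow), not taken verbatim from any one row:

```python
# One record of the dump, keyed by the columns in the schema above.
# Values are illustrative placeholders, not real data.
record = {
    "id": "example-org/example-dataset",        # string, length 2-115
    "lastModified": "2023-10-08T08:02:53.000Z", # string, always length 24
    "tags": ["region:us"],                      # list of tag strings
    "author": "example-org",                    # string, length 2-42
    "description": "",                          # string, may be empty
    "citation": "",                             # string, may be empty
    "likes": 0,                                 # int64
    "downloads": 0,                             # int64
    "created": "2023-10-08T08:00:50",           # timestamp[us]
    "card": "Entry not found",                  # raw dataset-card text
    "card_len": 15,                             # int64, length of `card`
    "embeddings": [[-0.0214, -0.0150, 0.0572]], # list of float vectors (truncated here)
}

# card_len is simply the character length of the card field.
assert record["card_len"] == len(record["card"])
```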
QJHao/sd-conf
2023-10-08T08:02:53.000Z
[ "region:us" ]
QJHao
null
null
0
0
2023-10-08T08:00:50
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
tech-winning/health_insurance_test_set
2023-10-08T09:35:10.000Z
[ "region:us" ]
tech-winning
null
null
1
0
2023-10-08T08:23:21
(1) This is a Chinese-language dataset on medical insurance. It uses a single-answer multiple-choice format and can be used to test how well large language models have mastered knowledge in the medical-insurance domain. (2) The dataset covers three knowledge modules: basic medical-insurance knowledge, medical-insurance supervision methods, and the bases for medical-insurance supervision. It spans many subdomains, including medical-insurance policy documents, medical-insurance data models, definitions and details of supervision rules, diagnosis-and-treatment item catalogs, drug package inserts, and examination and laboratory procedures. (3) The dataset was built automatically from medical-insurance knowledge-base texts using Qwen-14B-chat. Because it was constructed automatically by a large model, some test items may contain errors or omissions, but the overall quality is high, and the dataset can to some extent reflect how well an evaluated model has mastered this knowledge.
249
[ [ -0.027435302734375, -0.06683349609375, 0.02099609375, 0.0380859375, -0.035400390625, -0.0213470458984375, 0.00823974609375, -0.01806640625, 0.015380859375, 0.042205810546875, -0.023101806640625, -0.0406494140625, -0.0258636474609375, -0.0031795501708984375, ...
minh21/COVID-QA-Chunk-64-question-answering-biencoder-data-65_25_10
2023-10-08T08:41:24.000Z
[ "region:us" ]
minh21
null
null
0
0
2023-10-08T08:41:20
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: question dtype: string - name: answer dtype: string - name: context_chunks sequence: string - name: document_id dtype: int64 - name: id dtype: int64 splits: - name: train num_bytes: 48800727 num_examples: 1176 - name: validation num_bytes: 4517266 num_examples: 134 download_size: 13294538 dataset_size: 53317993 --- # Dataset Card for "COVID-QA-Chunk-64-question-answering-biencoder-data-65_25_10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
761
[ [ -0.042205810546875, -0.0277099609375, 0.005031585693359375, 0.01532745361328125, -0.0187530517578125, -0.00457000732421875, 0.036651611328125, -0.00867462158203125, 0.046722412109375, 0.018280029296875, -0.04876708984375, -0.038543701171875, -0.033599853515625, ...
Aweminus/ReForm-Eval-Data
2023-10-18T17:28:18.000Z
[ "region:us" ]
Aweminus
null
null
0
0
2023-10-08T09:20:45
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
openskyml/models
2023-10-08T10:32:59.000Z
[ "language:en", "code", "region:us" ]
openskyml
null
null
0
0
2023-10-08T10:22:16
--- language: - en tags: - code --- # Models ## GPTs: [• Pigeon-TextGen](https://huggingface.co/openskyml/pigeon-textgen) [• GPT-2](https://huggingface.co/gpt2) ## Chats: [• Falcon-180B-chat](https://huggingface.co/tiiuae/falcon-180B-chat) [• LLaMA-13B-Chat-GGUF](https://huggingface.co/TheBloke/Llama-2-13B-chat-GGUF) ## Diffusions: [• SD-1.5](https://huggingface.co/runwayml/stable-diffusion-v1-5) [• DALL·E-mini](https://huggingface.co/dalle-mini/dalle-mini)
469
[ [ -0.046173095703125, -0.038116455078125, 0.0181427001953125, 0.040618896484375, -0.0288238525390625, 0.0146942138671875, 0.02880859375, -0.0252532958984375, 0.046356201171875, 0.01242828369140625, -0.06414794921875, -0.041656494140625, -0.060150146484375, 0.0...
openskyml/wikipedia
2023-10-08T10:37:06.000Z
[ "task_categories:text-generation", "task_categories:fill-mask", "task_ids:language-modeling", "task_ids:masked-language-modeling", "annotations_creators:no-annotation", "language_creators:crowdsourced", "multilinguality:multilingual", "size_categories:n<1K", "size_categories:1K<n<10K", "size_categ...
openskyml
Wikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (https://dumps.wikimedia.org/) with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.).
@ONLINE {wikidump, author = {Wikimedia Foundation}, title = {Wikimedia Downloads}, url = {https://dumps.wikimedia.org} }
1
0
2023-10-08T10:34:49
--- annotations_creators: - no-annotation language_creators: - crowdsourced pretty_name: Wikipedia paperswithcode_id: null license: - cc-by-sa-3.0 - gfdl task_categories: - text-generation - fill-mask task_ids: - language-modeling - masked-language-modeling source_datasets: - original multilinguality: - multilingual size_categories: - n<1K - 1K<n<10K - 10K<n<100K - 100K<n<1M - 1M<n<10M language: - aa - ab - ace - af - ak - als - am - an - ang - ar - arc - arz - as - ast - atj - av - ay - az - azb - ba - bar - bcl - be - bg - bh - bi - bjn - bm - bn - bo - bpy - br - bs - bug - bxr - ca - cbk - cdo - ce - ceb - ch - cho - chr - chy - ckb - co - cr - crh - cs - csb - cu - cv - cy - da - de - din - diq - dsb - dty - dv - dz - ee - el - eml - en - eo - es - et - eu - ext - fa - ff - fi - fj - fo - fr - frp - frr - fur - fy - ga - gag - gan - gd - gl - glk - gn - gom - gor - got - gu - gv - ha - hak - haw - he - hi - hif - ho - hr - hsb - ht - hu - hy - ia - id - ie - ig - ii - ik - ilo - inh - io - is - it - iu - ja - jam - jbo - jv - ka - kaa - kab - kbd - kbp - kg - ki - kj - kk - kl - km - kn - ko - koi - krc - ks - ksh - ku - kv - kw - ky - la - lad - lb - lbe - lez - lfn - lg - li - lij - lmo - ln - lo - lrc - lt - ltg - lv - lzh - mai - mdf - mg - mh - mhr - mi - min - mk - ml - mn - mr - mrj - ms - mt - mus - mwl - my - myv - mzn - na - nah - nan - nap - nds - ne - new - ng - nl - nn - 'no' - nov - nrf - nso - nv - ny - oc - olo - om - or - os - pa - pag - pam - pap - pcd - pdc - pfl - pi - pih - pl - pms - pnb - pnt - ps - pt - qu - rm - rmy - rn - ro - ru - rue - rup - rw - sa - sah - sat - sc - scn - sco - sd - se - sg - sgs - sh - si - sk - sl - sm - sn - so - sq - sr - srn - ss - st - stq - su - sv - sw - szl - ta - tcy - tdt - te - tg - th - ti - tk - tl - tn - to - tpi - tr - ts - tt - tum - tw - ty - tyv - udm - ug - uk - ur - uz - ve - vec - vep - vi - vls - vo - vro - wa - war - wo - wuu - xal - xh - xmf - yi - yo - yue - za - zea - zh - zu 
language_bcp47: - nds-nl dataset_info: - config_name: 20220301.de features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 8905282792 num_examples: 2665357 download_size: 6523215105 dataset_size: 8905282792 - config_name: 20220301.en features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 20275516160 num_examples: 6458670 download_size: 20598313936 dataset_size: 20275516160 - config_name: 20220301.fr features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 7375920768 num_examples: 2402095 download_size: 5602565274 dataset_size: 7375920768 - config_name: 20220301.frr features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 9129760 num_examples: 15199 download_size: 12438017 dataset_size: 9129760 - config_name: 20220301.it features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 4539944448 num_examples: 1743035 download_size: 3516441239 dataset_size: 4539944448 - config_name: 20220301.simple features: - name: id dtype: string - name: url dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 235072360 num_examples: 205328 download_size: 239682796 dataset_size: 235072360 config_names: - 20220301.aa - 20220301.ab - 20220301.ace - 20220301.ady - 20220301.af - 20220301.ak - 20220301.als - 20220301.am - 20220301.an - 20220301.ang - 20220301.ar - 20220301.arc - 20220301.arz - 20220301.as - 20220301.ast - 20220301.atj - 20220301.av - 20220301.ay - 20220301.az - 20220301.azb - 20220301.ba - 20220301.bar - 20220301.bat-smg - 20220301.bcl - 20220301.be - 20220301.be-x-old - 
20220301.bg - 20220301.bh - 20220301.bi - 20220301.bjn - 20220301.bm - 20220301.bn - 20220301.bo - 20220301.bpy - 20220301.br - 20220301.bs - 20220301.bug - 20220301.bxr - 20220301.ca - 20220301.cbk-zam - 20220301.cdo - 20220301.ce - 20220301.ceb - 20220301.ch - 20220301.cho - 20220301.chr - 20220301.chy - 20220301.ckb - 20220301.co - 20220301.cr - 20220301.crh - 20220301.cs - 20220301.csb - 20220301.cu - 20220301.cv - 20220301.cy - 20220301.da - 20220301.de - 20220301.din - 20220301.diq - 20220301.dsb - 20220301.dty - 20220301.dv - 20220301.dz - 20220301.ee - 20220301.el - 20220301.eml - 20220301.en - 20220301.eo - 20220301.es - 20220301.et - 20220301.eu - 20220301.ext - 20220301.fa - 20220301.ff - 20220301.fi - 20220301.fiu-vro - 20220301.fj - 20220301.fo - 20220301.fr - 20220301.frp - 20220301.frr - 20220301.fur - 20220301.fy - 20220301.ga - 20220301.gag - 20220301.gan - 20220301.gd - 20220301.gl - 20220301.glk - 20220301.gn - 20220301.gom - 20220301.gor - 20220301.got - 20220301.gu - 20220301.gv - 20220301.ha - 20220301.hak - 20220301.haw - 20220301.he - 20220301.hi - 20220301.hif - 20220301.ho - 20220301.hr - 20220301.hsb - 20220301.ht - 20220301.hu - 20220301.hy - 20220301.ia - 20220301.id - 20220301.ie - 20220301.ig - 20220301.ii - 20220301.ik - 20220301.ilo - 20220301.inh - 20220301.io - 20220301.is - 20220301.it - 20220301.iu - 20220301.ja - 20220301.jam - 20220301.jbo - 20220301.jv - 20220301.ka - 20220301.kaa - 20220301.kab - 20220301.kbd - 20220301.kbp - 20220301.kg - 20220301.ki - 20220301.kj - 20220301.kk - 20220301.kl - 20220301.km - 20220301.kn - 20220301.ko - 20220301.koi - 20220301.krc - 20220301.ks - 20220301.ksh - 20220301.ku - 20220301.kv - 20220301.kw - 20220301.ky - 20220301.la - 20220301.lad - 20220301.lb - 20220301.lbe - 20220301.lez - 20220301.lfn - 20220301.lg - 20220301.li - 20220301.lij - 20220301.lmo - 20220301.ln - 20220301.lo - 20220301.lrc - 20220301.lt - 20220301.ltg - 20220301.lv - 20220301.mai - 20220301.map-bms - 20220301.mdf - 
20220301.mg - 20220301.mh - 20220301.mhr - 20220301.mi - 20220301.min - 20220301.mk - 20220301.ml - 20220301.mn - 20220301.mr - 20220301.mrj - 20220301.ms - 20220301.mt - 20220301.mus - 20220301.mwl - 20220301.my - 20220301.myv - 20220301.mzn - 20220301.na - 20220301.nah - 20220301.nap - 20220301.nds - 20220301.nds-nl - 20220301.ne - 20220301.new - 20220301.ng - 20220301.nl - 20220301.nn - 20220301.no - 20220301.nov - 20220301.nrm - 20220301.nso - 20220301.nv - 20220301.ny - 20220301.oc - 20220301.olo - 20220301.om - 20220301.or - 20220301.os - 20220301.pa - 20220301.pag - 20220301.pam - 20220301.pap - 20220301.pcd - 20220301.pdc - 20220301.pfl - 20220301.pi - 20220301.pih - 20220301.pl - 20220301.pms - 20220301.pnb - 20220301.pnt - 20220301.ps - 20220301.pt - 20220301.qu - 20220301.rm - 20220301.rmy - 20220301.rn - 20220301.ro - 20220301.roa-rup - 20220301.roa-tara - 20220301.ru - 20220301.rue - 20220301.rw - 20220301.sa - 20220301.sah - 20220301.sat - 20220301.sc - 20220301.scn - 20220301.sco - 20220301.sd - 20220301.se - 20220301.sg - 20220301.sh - 20220301.si - 20220301.simple - 20220301.sk - 20220301.sl - 20220301.sm - 20220301.sn - 20220301.so - 20220301.sq - 20220301.sr - 20220301.srn - 20220301.ss - 20220301.st - 20220301.stq - 20220301.su - 20220301.sv - 20220301.sw - 20220301.szl - 20220301.ta - 20220301.tcy - 20220301.te - 20220301.tet - 20220301.tg - 20220301.th - 20220301.ti - 20220301.tk - 20220301.tl - 20220301.tn - 20220301.to - 20220301.tpi - 20220301.tr - 20220301.ts - 20220301.tt - 20220301.tum - 20220301.tw - 20220301.ty - 20220301.tyv - 20220301.udm - 20220301.ug - 20220301.uk - 20220301.ur - 20220301.uz - 20220301.ve - 20220301.vec - 20220301.vep - 20220301.vi - 20220301.vls - 20220301.vo - 20220301.wa - 20220301.war - 20220301.wo - 20220301.wuu - 20220301.xal - 20220301.xh - 20220301.xmf - 20220301.yi - 20220301.yo - 20220301.za - 20220301.zea - 20220301.zh - 20220301.zh-classical - 20220301.zh-min-nan - 20220301.zh-yue - 20220301.zu --- # 
Dataset Card for Wikipedia ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org) - **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Dataset Summary Wikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (https://dumps.wikimedia.org/) with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.). The articles are parsed using the ``mwparserfromhell`` tool. 
To load this dataset you need to install Apache Beam and ``mwparserfromhell`` first: ``` pip install apache_beam mwparserfromhell ``` Then, you can load any subset of Wikipedia per language and per date this way: ```python from datasets import load_dataset load_dataset("wikipedia", language="sw", date="20220120", beam_runner=...) ``` where you can pass as `beam_runner` any Apache Beam supported runner for (distributed) data processing (see [here](https://beam.apache.org/documentation/runners/capability-matrix/)). Pass "DirectRunner" to run it on your machine. You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html). Some subsets of Wikipedia have already been processed by HuggingFace, and you can load them just with: ```python from datasets import load_dataset load_dataset("wikipedia", "20220301.en") ``` The list of pre-processed subsets is: - "20220301.de" - "20220301.en" - "20220301.fr" - "20220301.frr" - "20220301.it" - "20220301.simple" ### Supported Tasks and Leaderboards The dataset is generally used for Language Modeling. ### Languages You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias). ## Dataset Structure ### Data Instances An example looks as follows: ``` {'id': '1', 'url': 'https://simple.wikipedia.org/wiki/April', 'title': 'April', 'text': 'April is the fourth month...' 
} ``` Some subsets of Wikipedia have already been processed by HuggingFace, as you can see below: #### 20220301.de - **Size of downloaded dataset files:** 6.84 GB - **Size of the generated dataset:** 9.34 GB - **Total amount of disk used:** 16.18 GB #### 20220301.en - **Size of downloaded dataset files:** 21.60 GB - **Size of the generated dataset:** 21.26 GB - **Total amount of disk used:** 42.86 GB #### 20220301.fr - **Size of downloaded dataset files:** 5.87 GB - **Size of the generated dataset:** 7.73 GB - **Total amount of disk used:** 13.61 GB #### 20220301.frr - **Size of downloaded dataset files:** 13.04 MB - **Size of the generated dataset:** 9.57 MB - **Total amount of disk used:** 22.62 MB #### 20220301.it - **Size of downloaded dataset files:** 3.69 GB - **Size of the generated dataset:** 4.76 GB - **Total amount of disk used:** 8.45 GB #### 20220301.simple - **Size of downloaded dataset files:** 251.32 MB - **Size of the generated dataset:** 246.49 MB - **Total amount of disk used:** 497.82 MB ### Data Fields The data fields are the same among all configurations: - `id` (`str`): ID of the article. - `url` (`str`): URL of the article. - `title` (`str`): Title of the article. - `text` (`str`): Text content of the article. ### Data Splits Here are the number of examples for several configurations: | name | train | |-----------------|--------:| | 20220301.de | 2665357 | | 20220301.en | 6458670 | | 20220301.fr | 2402095 | | 20220301.frr | 15199 | | 20220301.it | 1743035 | | 20220301.simple | 205328 | ## Dataset Creation ### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? 
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Annotations #### Annotation process [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the annotators? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Personal and Sensitive Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Discussion of Biases [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Other Known Limitations [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Additional Information ### Dataset Curators [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Licensing Information Most of Wikipedia's text and many of its images are co-licensed under the [Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License) (CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License) (GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts). 
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text. ### Citation Information ``` @ONLINE{wikidump, author = "Wikimedia Foundation", title = "Wikimedia Downloads", url = "https://dumps.wikimedia.org" } ``` ### Contributions Thanks to [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset.
16,258
[ [ -0.060699462890625, -0.0438232421875, 0.01358795166015625, 0.00971221923828125, -0.01273345947265625, -0.020660400390625, -0.027130126953125, -0.033447265625, 0.03802490234375, 0.0236663818359375, -0.0548095703125, -0.0614013671875, -0.033050537109375, 0.020...
mfmezger/de_test
2023-10-08T12:06:00.000Z
[ "region:us" ]
mfmezger
null
null
0
0
2023-10-08T12:05:12
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
maxzancanaro/autotrain-data-data-protection_194
2023-10-08T12:30:49.000Z
[ "task_categories:text-classification", "region:us" ]
maxzancanaro
null
null
0
0
2023-10-08T12:30:19
--- task_categories: - text-classification --- # AutoTrain Dataset for project: data-protection_194 ## Dataset Description This dataset has been automatically processed by AutoTrain for project data-protection_194. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "text": "grindr conserver\u00e0 i registri delle applicazioni in virt\u00f9 della riservatezza, in un ambiente controllato e sicuro, per sei (6) mesi dalla data di sottoscrizione", "target": 0 }, { "text": "riceve una licenza revocabile, non- esclusiva, non-cedibile, limitata e personale per l'accesso e la scelta dei diritti che ea rende espressamente disponibili", "target": 1 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "text": "Value(dtype='string', id=None)", "target": "ClassLabel(names=['data protection', 'other'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 154 | | valid | 40 |
1,246
[ [ -0.027740478515625, 0.0003504753112792969, 0.0010509490966796875, 0.01617431640625, -0.013763427734375, 0.0159149169921875, -0.0054168701171875, -0.02276611328125, -0.00609588623046875, 0.0286407470703125, -0.039825439453125, -0.049774169921875, -0.0375366210937...
pixel-coping/c4_derived
2023-10-08T12:33:07.000Z
[ "region:us" ]
pixel-coping
null
null
0
0
2023-10-08T12:32:55
--- configs: - config_name: default data_files: - split: c4 path: data/c4-* - split: biomedical path: data/biomedical-* - split: counterfactual path: data/counterfactual-* - split: academic path: data/academic-* dataset_info: features: - name: text dtype: string - name: url dtype: string splits: - name: c4 num_bytes: 1820234 num_examples: 1000 - name: biomedical num_bytes: 1803036 num_examples: 989 - name: counterfactual num_bytes: 1813882 num_examples: 985 - name: academic num_bytes: 1199491 num_examples: 986 download_size: 4124290 dataset_size: 6636643 --- # Dataset Card for "c4_derived" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
815
[ [ -0.03790283203125, -0.00928497314453125, 0.016387939453125, 0.019744873046875, -0.015167236328125, 0.0221099853515625, 0.0208740234375, -0.0335693359375, 0.050140380859375, 0.0286712646484375, -0.06329345703125, -0.060638427734375, -0.038970947265625, -0.006...
tr416/catholic_4800_dataset_20231008_131846
2023-10-08T13:18:48.000Z
[ "region:us" ]
tr416
null
null
0
0
2023-10-08T13:18:47
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 760128.0 num_examples: 296 - name: test num_bytes: 7704.0 num_examples: 3 download_size: 52079 dataset_size: 767832.0 --- # Dataset Card for "catholic_4800_dataset_20231008_131846" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
610
[ [ -0.040008544921875, 0.00677490234375, 0.008270263671875, 0.045867919921875, -0.01494598388671875, -0.023284912109375, 0.02679443359375, 0.0002472400665283203, 0.048309326171875, 0.047637939453125, -0.0618896484375, -0.045257568359375, -0.045318603515625, 0.0...
moayyad-16/potato_and_weeds-detection_dataset
2023-10-08T14:00:34.000Z
[ "region:us" ]
moayyad-16
null
null
0
0
2023-10-08T13:31:38
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
akshaysaju9660/llamav2_9660
2023-10-08T13:36:51.000Z
[ "region:us" ]
akshaysaju9660
null
null
0
0
2023-10-08T13:32:01
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
oroikon/vistext_chart_captioning
2023-10-08T14:12:09.000Z
[ "region:us" ]
oroikon
null
null
0
0
2023-10-08T14:12:09
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
vietlegalqa/fewshot_tvpl_2023
2023-10-08T14:46:13.000Z
[ "region:us" ]
vietlegalqa
null
null
0
0
2023-10-08T14:40:24
--- dataset_info: features: - name: Index dtype: int64 - name: URL dtype: string - name: Q dtype: string - name: Doc dtype: string - name: MASKED Doc dtype: string - name: Ans dtype: string splits: - name: train num_bytes: 68105 num_examples: 10 download_size: 49074 dataset_size: 68105 --- # Dataset Card for "fewshot_tvpl_2023" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
516
[ [ -0.054931640625, -0.00710296630859375, 0.007061004638671875, 0.00933837890625, -0.0197906494140625, -0.00676727294921875, 0.028564453125, -0.0036163330078125, 0.035858154296875, 0.04327392578125, -0.06884765625, -0.037078857421875, -0.0419921875, -0.01434326...
open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B
2023-10-27T20:18:30.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T14:44:56
--- pretty_name: Evaluation run of PulsarAI/Chat-AYB-Nova-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/Chat-AYB-Nova-13B](https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T20:18:17.450635](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B/blob/main/results_2023-10-27T20-18-17.450635.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0041946308724832215,\n\ \ \"em_stderr\": 0.0006618716168266419,\n \"f1\": 0.0802946728187919,\n\ \ \"f1_stderr\": 0.0016873252068220475,\n \"acc\": 0.44971346473405205,\n\ \ \"acc_stderr\": 0.010392725523775513\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266419,\n\ \ \"f1\": 0.0802946728187919,\n \"f1_stderr\": 0.0016873252068220475\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \ \ \"acc_stderr\": 0.009065050306776921\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774104\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|arc:challenge|25_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T14-44-32.660445.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T20_18_17.450635 path: - '**/details_harness|drop|3_2023-10-27T20-18-17.450635.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T20-18-17.450635.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T20_18_17.450635 path: - '**/details_harness|gsm8k|5_2023-10-27T20-18-17.450635.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T20-18-17.450635.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hellaswag|10_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-44-32.660445.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T14_44_32.660445 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-44-32.660445.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-44-32.660445.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T20_18_17.450635 path: - '**/details_harness|winogrande|5_2023-10-27T20-18-17.450635.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T20-18-17.450635.parquet' - config_name: results data_files: - split: 2023_10_08T14_44_32.660445 path: - results_2023-10-08T14-44-32.660445.parquet - split: 2023_10_27T20_18_17.450635 path: - results_2023-10-27T20-18-17.450635.parquet - split: latest path: - results_2023-10-27T20-18-17.450635.parquet --- # Dataset Card for Evaluation run of PulsarAI/Chat-AYB-Nova-13B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/Chat-AYB-Nova-13B](https://huggingface.co/PulsarAI/Chat-AYB-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T20:18:17.450635](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Nova-13B/blob/main/results_2023-10-27T20-18-17.450635.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0041946308724832215, "em_stderr": 0.0006618716168266419, "f1": 0.0802946728187919, "f1_stderr": 0.0016873252068220475, "acc": 0.44971346473405205, "acc_stderr": 0.010392725523775513 }, "harness|drop|3": { "em": 0.0041946308724832215, "em_stderr": 0.0006618716168266419, "f1": 0.0802946728187919, "f1_stderr": 0.0016873252068220475 }, "harness|gsm8k|5": { "acc": 0.12357846853677028, "acc_stderr": 0.009065050306776921 }, "harness|winogrande|5": { "acc": 0.7758484609313339, "acc_stderr": 0.011720400740774104 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
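For the metrics shown above, the `"all"` block can be reproduced as an unweighted mean of each metric over the tasks that report it. That the aggregation is a plain mean is an observation about these particular numbers, not documented behavior, but it matches the copied values; a minimal sketch:

```python
# Per-task metrics copied from the latest-results JSON above
# (stderr fields omitted for brevity).
results = {
    "harness|drop|3": {"em": 0.0041946308724832215, "f1": 0.0802946728187919},
    "harness|gsm8k|5": {"acc": 0.12357846853677028},
    "harness|winogrande|5": {"acc": 0.7758484609313339},
}

def aggregate(per_task):
    """Average each metric over the tasks that report it."""
    sums, counts = {}, {}
    for metrics in per_task.values():
        for name, value in metrics.items():
            sums[name] = sums.get(name, 0.0) + value
            counts[name] = counts.get(name, 0) + 1
    return {name: sums[name] / counts[name] for name in sums}

overall = aggregate(results)
print(round(overall["acc"], 6))  # mean of the gsm8k and winogrande acc, about 0.449713
```

Here `"acc"` averages over gsm8k and winogrande, while `"em"` and `"f1"` come from drop alone, which is why the `"all"` block repeats that task's values verbatim.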
38,638
[ [ -0.025543212890625, -0.05413818359375, 0.0085601806640625, 0.0279083251953125, -0.00699615478515625, 0.0091400146484375, -0.0274658203125, -0.008148193359375, 0.03204345703125, 0.038238525390625, -0.051910400390625, -0.06396484375, -0.047698974609375, 0.0115...
open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B
2023-10-28T16:53:53.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T14:46:28
--- pretty_name: Evaluation run of PulsarAI/Chat-AYB-Platypus2-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/Chat-AYB-Platypus2-13B](https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T16:53:41.047162](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B/blob/main/results_2023-10-28T16-53-41.047162.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2752726510067114,\n\ \ \"em_stderr\": 0.0045741300617909856,\n \"f1\": 0.38116505872483314,\n\ \ \"f1_stderr\": 0.004403649120675284,\n \"acc\": 0.3936315988829403,\n\ \ \"acc_stderr\": 0.0083541228301978\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2752726510067114,\n \"em_stderr\": 0.0045741300617909856,\n\ \ \"f1\": 0.38116505872483314,\n \"f1_stderr\": 0.004403649120675284\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \ \ \"acc_stderr\": 0.004665893134220814\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174785\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|arc:challenge|25_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T14-46-05.202813.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T16_53_41.047162 path: - '**/details_harness|drop|3_2023-10-28T16-53-41.047162.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T16-53-41.047162.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T16_53_41.047162 path: - '**/details_harness|gsm8k|5_2023-10-28T16-53-41.047162.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T16-53-41.047162.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hellaswag|10_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T14_46_05.202813 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-46-05.202813.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T14_46_05.202813 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-46-05.202813.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-46-05.202813.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T16_53_41.047162 path: - '**/details_harness|winogrande|5_2023-10-28T16-53-41.047162.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T16-53-41.047162.parquet' - config_name: results data_files: - split: 2023_10_08T14_46_05.202813 path: - results_2023-10-08T14-46-05.202813.parquet - split: 2023_10_28T16_53_41.047162 path: - results_2023-10-28T16-53-41.047162.parquet - split: latest path: - results_2023-10-28T16-53-41.047162.parquet --- # Dataset Card for Evaluation run of PulsarAI/Chat-AYB-Platypus2-13B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/Chat-AYB-Platypus2-13B](https://huggingface.co/PulsarAI/Chat-AYB-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T16:53:41.047162](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Chat-AYB-Platypus2-13B/blob/main/results_2023-10-28T16-53-41.047162.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2752726510067114,
        "em_stderr": 0.0045741300617909856,
        "f1": 0.38116505872483314,
        "f1_stderr": 0.004403649120675284,
        "acc": 0.3936315988829403,
        "acc_stderr": 0.0083541228301978
    },
    "harness|drop|3": {
        "em": 0.2752726510067114,
        "em_stderr": 0.0045741300617909856,
        "f1": 0.38116505872483314,
        "f1_stderr": 0.004403649120675284
    },
    "harness|gsm8k|5": {
        "acc": 0.029567854435178165,
        "acc_stderr": 0.004665893134220814
    },
    "harness|winogrande|5": {
        "acc": 0.7576953433307024,
        "acc_stderr": 0.012042352526174785
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
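Since the per-run results files are plain nested JSON like the "latest results" block above, they can be filtered with the standard library alone once downloaded. A minimal sketch over an abridged copy of the numbers shown above (the stderr fields are dropped for brevity; the values themselves are taken verbatim from the card):

```python
import json

# Abridged copy of the aggregated results shown in the card above.
raw = """
{
  "all": {"em": 0.2752726510067114, "f1": 0.38116505872483314, "acc": 0.3936315988829403},
  "harness|drop|3": {"em": 0.2752726510067114, "f1": 0.38116505872483314},
  "harness|gsm8k|5": {"acc": 0.029567854435178165},
  "harness|winogrande|5": {"acc": 0.7576953433307024}
}
"""

results = json.loads(raw)

# Keep only the entries that report an accuracy (drop is scored with em/f1).
accs = {task: vals["acc"] for task, vals in results.items() if "acc" in vals}
print(accs)
```

In practice the same dictionary comes back from the "results" configuration mentioned above, where the "latest" split points at the most recent run.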
38,682
open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B
2023-10-26T09:15:40.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T14:51:33
--- pretty_name: Evaluation run of PulsarAI/2x-LoRA-Assemble-Nova-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/2x-LoRA-Assemble-Nova-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T09:15:27.308196](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B/blob/main/results_2023-10-26T09-15-27.308196.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005243288590604027,\n\ \ \"em_stderr\": 0.0007396052260778,\n \"f1\": 0.08796455536912774,\n\ \ \"f1_stderr\": 0.0018271669211415338,\n \"acc\": 0.4359422992113922,\n\ \ \"acc_stderr\": 0.010092491580522747\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.005243288590604027,\n \"em_stderr\": 0.0007396052260778,\n\ \ \"f1\": 0.08796455536912774,\n \"f1_stderr\": 0.0018271669211415338\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1023502653525398,\n \ \ \"acc_stderr\": 0.008349110996208824\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483667\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|arc:challenge|25_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T14-51-09.823341.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T09_15_27.308196 path: - '**/details_harness|drop|3_2023-10-26T09-15-27.308196.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T09-15-27.308196.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T09_15_27.308196 path: - '**/details_harness|gsm8k|5_2023-10-26T09-15-27.308196.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T09-15-27.308196.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hellaswag|10_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T14_51_09.823341 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-51-09.823341.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T14_51_09.823341 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-51-09.823341.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-51-09.823341.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T09_15_27.308196 path: - '**/details_harness|winogrande|5_2023-10-26T09-15-27.308196.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T09-15-27.308196.parquet' - config_name: results data_files: - split: 2023_10_08T14_51_09.823341 path: - results_2023-10-08T14-51-09.823341.parquet - split: 2023_10_26T09_15_27.308196 path: - results_2023-10-26T09-15-27.308196.parquet - split: latest path: - results_2023-10-26T09-15-27.308196.parquet --- # Dataset Card for Evaluation run of PulsarAI/2x-LoRA-Assemble-Nova-13B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/2x-LoRA-Assemble-Nova-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B", "harness_winogrande_5", split="latest") ``` ## Latest results These are the [latest results from run 2023-10-26T09:15:27.308196](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Nova-13B/blob/main/results_2023-10-26T09-15-27.308196.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.005243288590604027, "em_stderr": 0.0007396052260778, "f1": 0.08796455536912774, "f1_stderr": 0.0018271669211415338, "acc": 0.4359422992113922, "acc_stderr": 0.010092491580522747 }, "harness|drop|3": { "em": 0.005243288590604027, "em_stderr": 0.0007396052260778, "f1": 0.08796455536912774, "f1_stderr": 0.0018271669211415338 }, "harness|gsm8k|5": { "acc": 0.1023502653525398, "acc_stderr": 0.008349110996208824 }, "harness|winogrande|5": { "acc": 0.7695343330702447, "acc_stderr": 0.01183587216483667 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
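The timestamped split names in the configurations above follow directly from the run timestamps. As a sketch (this helper is hypothetical, not part of the leaderboard tooling), the mapping simply replaces the `-` and `:` separators with `_`, which is handy when you want to load one specific run's split rather than `latest`:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map an evaluation-run timestamp to its dataset split name.

    The split names replace the "-" and ":" separators with "_",
    keeping the "T" and the fractional seconds unchanged.
    """
    return timestamp.replace("-", "_").replace(":", "_")


# "2023-10-26T09:15:27.308196" -> "2023_10_26T09_15_27.308196"
print(run_timestamp_to_split("2023-10-26T09:15:27.308196"))
```

The resulting string matches the split names listed under each `config_name` entry, e.g. `2023_10_26T09_15_27.308196`.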
38,716
[ [ -0.0231781005859375, -0.051055908203125, 0.0135650634765625, 0.025726318359375, -0.00811767578125, 0.004608154296875, -0.0187225341796875, -0.01399993896484375, 0.03173828125, 0.041168212890625, -0.049072265625, -0.0640869140625, -0.049835205078125, 0.010261...
open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B
2023-10-26T04:49:23.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T14:58:57
--- pretty_name: Evaluation run of PulsarAI/2x-LoRA-Assemble-Platypus2-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/2x-LoRA-Assemble-Platypus2-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"latest\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T04:49:09.510505](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B/blob/main/results_2023-10-26T04-49-09.510505.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14020553691275167,\n\ \ \"em_stderr\": 0.003555654511760366,\n \"f1\": 0.25958473154362444,\n\ \ \"f1_stderr\": 0.003697673494004961,\n \"acc\": 0.3790556094431875,\n\ \ \"acc_stderr\": 0.007400551365645916\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.14020553691275167,\n \"em_stderr\": 0.003555654511760366,\n\ \ \"f1\": 0.25958473154362444,\n \"f1_stderr\": 0.003697673494004961\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \ \ \"acc_stderr\": 0.002615326510775672\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516161\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|arc:challenge|25_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T14-58-33.553023.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T04_49_09.510505 path: - '**/details_harness|drop|3_2023-10-26T04-49-09.510505.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T04-49-09.510505.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T04_49_09.510505 path: - '**/details_harness|gsm8k|5_2023-10-26T04-49-09.510505.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T04-49-09.510505.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hellaswag|10_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T14-58-33.553023.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T14_58_33.553023 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-58-33.553023.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T14-58-33.553023.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T04_49_09.510505 path: - '**/details_harness|winogrande|5_2023-10-26T04-49-09.510505.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T04-49-09.510505.parquet' - config_name: results data_files: - split: 2023_10_08T14_58_33.553023 path: - results_2023-10-08T14-58-33.553023.parquet - split: 2023_10_26T04_49_09.510505 path: - results_2023-10-26T04-49-09.510505.parquet - split: latest path: - results_2023-10-26T04-49-09.510505.parquet --- # Dataset Card for Evaluation run of PulsarAI/2x-LoRA-Assemble-Platypus2-13B ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/2x-LoRA-Assemble-Platypus2-13B](https://huggingface.co/PulsarAI/2x-LoRA-Assemble-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T04:49:09.510505](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__2x-LoRA-Assemble-Platypus2-13B/blob/main/results_2023-10-26T04-49-09.510505.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.14020553691275167, "em_stderr": 0.003555654511760366, "f1": 0.25958473154362444, "f1_stderr": 0.003697673494004961, "acc": 0.3790556094431875, "acc_stderr": 0.007400551365645916 }, "harness|drop|3": { "em": 0.14020553691275167, "em_stderr": 0.003555654511760366, "f1": 0.25958473154362444, "f1_stderr": 0.003697673494004961 }, "harness|gsm8k|5": { "acc": 0.009097801364670205, "acc_stderr": 0.002615326510775672 }, "harness|winogrande|5": { "acc": 0.7490134175217048, "acc_stderr": 0.012185776220516161 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
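The split-naming convention described in the card above (one timestamped split per run, plus a "latest" alias) can be illustrated with a minimal sketch; the two split names below are taken from this card's own config list, and picking the newest run is assumed to reduce to a lexicographic max because the timestamps are zero-padded:

```python
# Split names follow the zero-padded pattern YYYY_MM_DDTHH_MM_SS.microseconds,
# so lexicographic order matches chronological order and max() picks the
# newest run -- the one the "latest" split alias resolves to.
splits = [
    "2023_10_08T14_58_33.553023",
    "2023_10_26T04_49_09.510505",
]
latest = max(splits)
print(latest)  # 2023_10_26T04_49_09.510505
```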
38,782
[ [ -0.0255126953125, -0.050933837890625, 0.014495849609375, 0.0255584716796875, -0.01453399658203125, 0.004901885986328125, -0.0258331298828125, -0.01441192626953125, 0.030364990234375, 0.04046630859375, -0.05084228515625, -0.058837890625, -0.04998779296875, 0....
Hxfi/RAVI-AYANOLI
2023-10-08T15:01:42.000Z
[ "region:us" ]
Hxfi
null
null
0
0
2023-10-08T15:01:42
Entry not found
15
[ [ -0.0213775634765625, -0.014984130859375, 0.05718994140625, 0.0288543701171875, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005062103271484375, 0.051361083984375, 0.016998291015625, -0.0521240234375, -0.01496124267578125, -0.0604248046875, 0.037...
open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B
2023-10-29T14:59:11.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T15:05:43
--- pretty_name: Evaluation run of PulsarAI/GenAI-Nova-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PulsarAI/GenAI-Nova-13B](https://huggingface.co/PulsarAI/GenAI-Nova-13B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-29T14:58:59.300779](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B/blob/main/results_2023-10-29T14-58-59.300779.json)\ \ (note that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You can find each one in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10769714765100671,\n\ \ \"em_stderr\": 0.003174664916131534,\n \"f1\": 0.18815016778523358,\n\ \ \"f1_stderr\": 0.0033317211011039192,\n \"acc\": 0.4254059872915611,\n\ \ \"acc_stderr\": 0.009560931288960338\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.10769714765100671,\n \"em_stderr\": 0.003174664916131534,\n\ \ \"f1\": 0.18815016778523358,\n \"f1_stderr\": 0.0033317211011039192\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \ \ \"acc_stderr\": 0.007357713523222347\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.01176414905469833\n\ \ }\n}\n```" repo_url: https://huggingface.co/PulsarAI/GenAI-Nova-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|arc:challenge|25_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T15-05-19.512883.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_29T14_58_59.300779 path: - '**/details_harness|drop|3_2023-10-29T14-58-59.300779.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-29T14-58-59.300779.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_29T14_58_59.300779 path: - '**/details_harness|gsm8k|5_2023-10-29T14-58-59.300779.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-29T14-58-59.300779.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hellaswag|10_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T15-05-19.512883.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T15_05_19.512883 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T15-05-19.512883.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T15-05-19.512883.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_29T14_58_59.300779 path: - '**/details_harness|winogrande|5_2023-10-29T14-58-59.300779.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-29T14-58-59.300779.parquet' - config_name: results data_files: - split: 2023_10_08T15_05_19.512883 path: - results_2023-10-08T15-05-19.512883.parquet - split: 2023_10_29T14_58_59.300779 path: - results_2023-10-29T14-58-59.300779.parquet - split: latest path: - results_2023-10-29T14-58-59.300779.parquet --- # Dataset Card for Evaluation run of PulsarAI/GenAI-Nova-13B ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/PulsarAI/GenAI-Nova-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PulsarAI/GenAI-Nova-13B](https://huggingface.co/PulsarAI/GenAI-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-29T14:58:59.300779](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__GenAI-Nova-13B/blob/main/results_2023-10-29T14-58-59.300779.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.10769714765100671, "em_stderr": 0.003174664916131534, "f1": 0.18815016778523358, "f1_stderr": 0.0033317211011039192, "acc": 0.4254059872915611, "acc_stderr": 0.009560931288960338 }, "harness|drop|3": { "em": 0.10769714765100671, "em_stderr": 0.003174664916131534, "f1": 0.18815016778523358, "f1_stderr": 0.0033317211011039192 }, "harness|gsm8k|5": { "acc": 0.07733131159969674, "acc_stderr": 0.007357713523222347 }, "harness|winogrande|5": { "acc": 0.7734806629834254, "acc_stderr": 0.01176414905469833 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,590
[ [ -0.0250244140625, -0.0447998046875, 0.01235198974609375, 0.02545166015625, -0.010040283203125, 0.006473541259765625, -0.0244598388671875, -0.004062652587890625, 0.0272979736328125, 0.0369873046875, -0.04742431640625, -0.06500244140625, -0.048614501953125, 0....
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part0_test
2023-10-08T15:31:50.000Z
[ "region:us" ]
johannes-garstenauer
null
null
0
0
2023-10-08T15:31:41
--- dataset_info: features: - name: struct dtype: string - name: label dtype: int64 - name: pred dtype: int64 - name: cls_layer_6 sequence: float32 - name: cls_layer_5 sequence: float32 - name: cls_layer_4 sequence: float32 splits: - name: train num_bytes: 13428556 num_examples: 1408 download_size: 16660183 dataset_size: 13428556 --- # Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part0_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
604
[ [ -0.039031982421875, -0.04248046875, 0.01354217529296875, 0.03045654296875, -0.01317596435546875, 0.01103973388671875, 0.032867431640625, 0.0096588134765625, 0.057769775390625, 0.01873779296875, -0.04107666015625, -0.059051513671875, -0.051055908203125, -0.01...
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part1_test
2023-10-08T15:32:10.000Z
[ "region:us" ]
johannes-garstenauer
null
null
0
0
2023-10-08T15:32:01
--- dataset_info: features: - name: struct dtype: string - name: label dtype: int64 - name: pred dtype: int64 - name: cls_layer_6 sequence: float32 - name: cls_layer_5 sequence: float32 - name: cls_layer_4 sequence: float32 splits: - name: train num_bytes: 12230881 num_examples: 1283 download_size: 14962458 dataset_size: 12230881 --- # Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part1_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
604
[ [ -0.04010009765625, -0.048187255859375, 0.0114288330078125, 0.0318603515625, -0.015838623046875, 0.00771331787109375, 0.035369873046875, 0.0117645263671875, 0.058685302734375, 0.0204315185546875, -0.04498291015625, -0.059844970703125, -0.05267333984375, -0.02...
BounharAbdelaziz/Face-Aging-Dataset
2023-10-08T16:16:52.000Z
[ "region:us" ]
BounharAbdelaziz
null
null
0
0
2023-10-08T15:52:09
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': age_domain_20_35 '1': age_domain_36_60 '2': age_domain_60_90 splits: - name: train num_bytes: 16136815235.988 num_examples: 40252 download_size: 16202626214 dataset_size: 16136815235.988 --- # Dataset Card for "Face-Aging-Dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
624
[ [ -0.02569580078125, -0.023712158203125, 0.01531219482421875, 0.028533935546875, -0.01525115966796875, 0.00885772705078125, 0.0312042236328125, -0.01947021484375, 0.051300048828125, 0.03533935546875, -0.072998046875, -0.0426025390625, -0.0263824462890625, -0.0...
BounharAbdelaziz/Face-Gender-Swap
2023-10-08T16:25:03.000Z
[ "region:us" ]
BounharAbdelaziz
null
null
0
0
2023-10-08T15:53:22
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': domain_F '1': domain_M splits: - name: train num_bytes: 20710300480.468 num_examples: 51604 download_size: 20737281406 dataset_size: 20710300480.468 --- # Dataset Card for "Face-Gender-Swap" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
574
[ [ -0.042724609375, -0.01026153564453125, 0.00954437255859375, 0.0269927978515625, 0.0005345344543457031, 0.0095062255859375, 0.03558349609375, -0.0196990966796875, 0.04974365234375, 0.032012939453125, -0.07574462890625, -0.035400390625, -0.0433349609375, -0.01...
osanseviero/ag_misclassifications
2023-10-08T15:57:20.000Z
[ "region:us" ]
osanseviero
null
null
0
0
2023-10-08T15:54:11
This dataset contains a slice of 200 samples from the [AG News](https://huggingface.co/datasets/ag_news) dataset (test split). The 200 selected samples are potential misclassifications of the original test data. Approach * Fine-tune DistilBERT with 10k samples from the training data (out of 120k) * Do a forward pass with the model, storing the per-sample loss * Sort the samples by loss, descending This is a repository for demonstration purposes.
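The loss-ranking step of the approach above can be sketched as follows (a minimal sketch in which synthetic logits stand in for the fine-tuned DistilBERT forward pass; the array names and sizes are illustrative assumptions, not the actual script):

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, num_samples = 4, 7600  # AG News test split has 7,600 samples

# Stand-ins for the model's outputs: per-sample logits and the true labels.
logits = rng.normal(size=(num_samples, num_classes))
labels = rng.integers(0, num_classes, size=num_samples)

# Per-sample cross-entropy loss, i.e. the quantity stored during the forward pass.
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
per_sample_loss = -log_probs[np.arange(num_samples), labels]

# Sort descending: the 200 highest-loss samples are the candidate misclassifications.
candidate_ids = np.argsort(-per_sample_loss)[:200]
```

With a real model, `logits` would come from batched forward passes over the test split; the ranking step itself is unchanged.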
437
[ [ -0.04486083984375, -0.05133056640625, 0.019073486328125, -0.01183319091796875, -0.00826263427734375, 0.0255889892578125, 0.036529541015625, -0.0030364990234375, 0.01422882080078125, 0.036529541015625, -0.0595703125, -0.01611328125, -0.044769287109375, -0.005...
BangumiBase/rurounikenshin2023
2023-10-08T18:47:36.000Z
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
BangumiBase
null
null
0
0
2023-10-08T16:05:16
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Rurouni Kenshin (2023) This is the image base of bangumi Rurouni Kenshin (2023); we detected 27 characters and 3015 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 363 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 376 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 30 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 58 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 
2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 209 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 81 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 17 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 86 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 122 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 338 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 34 | 
[Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 43 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 27 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 27 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 67 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 49 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 7 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 
5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | N/A | | 17 | 189 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 21 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 16 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 688 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 13 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 14 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 31 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) 
| ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 33 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 9 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | noise | 67 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
9,703
[ [ -0.042816162109375, -0.00907135009765625, 0.00951385498046875, 0.01342010498046875, -0.01544189453125, -0.005458831787109375, -0.002048492431640625, -0.0240325927734375, 0.039581298828125, 0.031036376953125, -0.060150146484375, -0.053680419921875, -0.04129028320...
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit
2023-10-28T09:09:12.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T16:13:43
--- pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T09:09:00.701109](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit/blob/main/results_2023-10-28T09-09-00.701109.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004928691275167785,\n\ \ \"em_stderr\": 0.0007171872517059817,\n \"f1\": 0.06610213926174507,\n\ \ \"f1_stderr\": 0.001553905671666344,\n \"acc\": 0.4026417372707673,\n\ \ \"acc_stderr\": 0.009752392640502771\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.004928691275167785,\n \"em_stderr\": 0.0007171872517059817,\n\ \ \"f1\": 0.06610213926174507,\n \"f1_stderr\": 0.001553905671666344\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \ \ \"acc_stderr\": 0.007086462127954495\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.012418323153051046\n\ \ }\n}\n```" repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|arc:challenge|25_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T16-13-20.175189.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T09_09_00.701109 path: - '**/details_harness|drop|3_2023-10-28T09-09-00.701109.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T09-09-00.701109.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T09_09_00.701109 path: - '**/details_harness|gsm8k|5_2023-10-28T09-09-00.701109.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T09-09-00.701109.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hellaswag|10_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T16-13-20.175189.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T16_13_20.175189 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T16-13-20.175189.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T16-13-20.175189.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T09_09_00.701109 path: - '**/details_harness|winogrande|5_2023-10-28T09-09-00.701109.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T09-09-00.701109.parquet' - config_name: results data_files: - split: 2023_10_08T16_13_20.175189 path: - results_2023-10-08T16-13-20.175189.parquet - split: 2023_10_28T09_09_00.701109 path: - results_2023-10-28T09-09-00.701109.parquet - split: latest path: - results_2023-10-28T09-09-00.701109.parquet --- # Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit ## 
Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-mixed-datasets-time-unit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T09:09:00.701109](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-mixed-datasets-time-unit/blob/main/results_2023-10-28T09-09-00.701109.json) (note that there may be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each one in the results, under the "latest" split for each eval): ```python { "all": { "em": 0.004928691275167785, "em_stderr": 0.0007171872517059817, "f1": 0.06610213926174507, "f1_stderr": 0.001553905671666344, "acc": 0.4026417372707673, "acc_stderr": 0.009752392640502771 }, "harness|drop|3": { "em": 0.004928691275167785, "em_stderr": 0.0007171872517059817, "f1": 0.06610213926174507, "f1_stderr": 0.001553905671666344 }, "harness|gsm8k|5": { "acc": 0.0712661106899166, "acc_stderr": 0.007086462127954495 }, "harness|winogrande|5": { "acc": 0.734017363851618, "acc_stderr": 0.012418323153051046 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
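The card above notes that each run appears as a split named after the run's timestamp. Comparing the parquet filenames with the config entries shows the mapping is a plain character substitution: the hyphens in the filename timestamp become underscores in the split name. A minimal sketch of that mapping (the helper name is ours, not part of the dataset or the `datasets` API):

```python
def run_timestamp_to_split(ts: str) -> str:
    # Parquet filenames embed timestamps like '2023-10-28T09-09-00.701109',
    # while the config split names use '2023_10_28T09_09_00.701109'.
    return ts.replace("-", "_")

print(run_timestamp_to_split("2023-10-28T09-09-00.701109"))
```

With the split name in hand, a specific run can be selected via `split="2023_10_28T09_09_00.701109"` instead of `split="latest"` in the `load_dataset` call shown earlier.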
38,976
[ [ -0.0290069580078125, -0.055450439453125, 0.0147552490234375, 0.0211334228515625, -0.0183868408203125, 0.0070953369140625, -0.020355224609375, -0.020050048828125, 0.03912353515625, 0.04248046875, -0.047515869140625, -0.0662841796875, -0.042449951171875, 0.008...
mfmezger/deu-medical_meadow_mmmlu
2023-10-08T16:35:05.000Z
[ "region:us" ]
mfmezger
null
null
0
0
2023-10-08T16:34:49
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
mfmezger/deu-medical_meadow_pubmed_causal
2023-10-08T16:35:30.000Z
[ "region:us" ]
mfmezger
null
null
0
0
2023-10-08T16:35:22
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
open-llm-leaderboard/details_LeoLM__leo-hessianai-7b
2023-10-25T10:03:36.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T17:16:38
--- pretty_name: Evaluation run of LeoLM/leo-hessianai-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LeoLM/leo-hessianai-7b](https://huggingface.co/LeoLM/leo-hessianai-7b) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeoLM__leo-hessianai-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T10:03:23.884304](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b/blob/main/results_2023-10-25T10-03-23.884304.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\ \ \"em_stderr\": 0.0003144653119413205,\n \"f1\": 0.056075922818791854,\n\ \ \"f1_stderr\": 0.0013232326016856207,\n \"acc\": 0.38874610827245293,\n\ \ \"acc_stderr\": 0.009469282540407879\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413205,\n\ \ \"f1\": 0.056075922818791854,\n \"f1_stderr\": 0.0013232326016856207\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.056103108415466264,\n \ \ \"acc_stderr\": 0.006338668431321877\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7213891081294396,\n \"acc_stderr\": 0.01259989664949388\n\ \ }\n}\n```" repo_url: https://huggingface.co/LeoLM/leo-hessianai-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|arc:challenge|25_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T17-16-14.181420.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T10_03_23.884304 path: - '**/details_harness|drop|3_2023-10-25T10-03-23.884304.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T10-03-23.884304.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T10_03_23.884304 path: - '**/details_harness|gsm8k|5_2023-10-25T10-03-23.884304.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T10-03-23.884304.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hellaswag|10_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-16-14.181420.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T17_16_14.181420 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T17-16-14.181420.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T17-16-14.181420.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T10_03_23.884304 path: - '**/details_harness|winogrande|5_2023-10-25T10-03-23.884304.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T10-03-23.884304.parquet' - config_name: results data_files: - split: 2023_10_08T17_16_14.181420 path: - results_2023-10-08T17-16-14.181420.parquet - split: 2023_10_25T10_03_23.884304 path: - results_2023-10-25T10-03-23.884304.parquet - split: latest path: - results_2023-10-25T10-03-23.884304.parquet --- # Dataset Card for Evaluation run of LeoLM/leo-hessianai-7b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/LeoLM/leo-hessianai-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [LeoLM/leo-hessianai-7b](https://huggingface.co/LeoLM/leo-hessianai-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T10:03:23.884304](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b/blob/main/results_2023-10-25T10-03-23.884304.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0009437919463087249, "em_stderr": 0.0003144653119413205, "f1": 0.056075922818791854, "f1_stderr": 0.0013232326016856207, "acc": 0.38874610827245293, "acc_stderr": 0.009469282540407879 }, "harness|drop|3": { "em": 0.0009437919463087249, "em_stderr": 0.0003144653119413205, "f1": 0.056075922818791854, "f1_stderr": 0.0013232326016856207 }, "harness|gsm8k|5": { "acc": 0.056103108415466264, "acc_stderr": 0.006338668431321877 }, "harness|winogrande|5": { "acc": 0.7213891081294396, "acc_stderr": 0.01259989664949388 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
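The aggregated JSON shown in the "Latest results" section above is plain data, so it can be inspected without loading the dataset at all. A minimal sketch using only the standard library and the figures quoted above (note that, for this run, the overall `acc` works out to the unweighted mean of the two per-task accuracies):

```python
import json

# Aggregated metrics copied from the 2023-10-25 run reported above.
latest = json.loads("""
{
  "all": {"em": 0.0009437919463087249,
          "f1": 0.056075922818791854,
          "acc": 0.38874610827245293},
  "harness|gsm8k|5": {"acc": 0.056103108415466264},
  "harness|winogrande|5": {"acc": 0.7213891081294396}
}
""")

# The overall accuracy equals the unweighted mean of the task accuracies.
mean_acc = (latest["harness|gsm8k|5"]["acc"]
            + latest["harness|winogrande|5"]["acc"]) / 2
print(f"overall acc: {mean_acc:.6f}")  # overall acc: 0.388746
```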
38598
[ [ -0.03204345703125, -0.044036865234375, 0.00730133056640625, 0.0200653076171875, -0.01142120361328125, 0.005336761474609375, -0.02801513671875, -0.01885986328125, 0.0309295654296875, 0.035797119140625, -0.05120849609375, -0.0672607421875, -0.046630859375, 0.0...
0-hero/prompt-perfect
2023-10-08T17:27:15.000Z
[ "region:us" ]
0-hero
null
null
0
0
2023-10-08T17:27:15
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b
2023-10-28T16:43:36.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T17:27:39
--- pretty_name: Evaluation run of pankajmathur/orca_mini_v3_13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [pankajmathur/orca_mini_v3_13b](https://huggingface.co/pankajmathur/orca_mini_v3_13b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T16:43:24.612769](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b/blob/main/results_2023-10-28T16-43-24.612769.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15383808724832215,\n\ \ \"em_stderr\": 0.0036948628598682874,\n \"f1\": 0.22225880872483197,\n\ \ \"f1_stderr\": 0.0037670501187578413,\n \"acc\": 0.44797935342421163,\n\ \ \"acc_stderr\": 0.010609253699619367\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.15383808724832215,\n \"em_stderr\": 0.0036948628598682874,\n\ \ \"f1\": 0.22225880872483197,\n \"f1_stderr\": 0.0037670501187578413\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \ \ \"acc_stderr\": 0.00929849923558785\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650884\n\ \ }\n}\n```" repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|arc:challenge|25_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T17-27-15.323068.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T16_43_24.612769 path: - '**/details_harness|drop|3_2023-10-28T16-43-24.612769.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T16-43-24.612769.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T16_43_24.612769 path: - '**/details_harness|gsm8k|5_2023-10-28T16-43-24.612769.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T16-43-24.612769.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hellaswag|10_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-27-15.323068.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T17_27_15.323068 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T17-27-15.323068.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T17-27-15.323068.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T16_43_24.612769 path: - '**/details_harness|winogrande|5_2023-10-28T16-43-24.612769.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T16-43-24.612769.parquet' - config_name: results data_files: - split: 2023_10_08T17_27_15.323068 path: - results_2023-10-08T17-27-15.323068.parquet - split: 2023_10_28T16_43_24.612769 path: - results_2023-10-28T16-43-24.612769.parquet - split: latest path: - results_2023-10-28T16-43-24.612769.parquet --- # Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_13b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_13b](https://huggingface.co/pankajmathur/orca_mini_v3_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T16:43:24.612769](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_13b/blob/main/results_2023-10-28T16-43-24.612769.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.15383808724832215, "em_stderr": 0.0036948628598682874, "f1": 0.22225880872483197, "f1_stderr": 0.0037670501187578413, "acc": 0.44797935342421163, "acc_stderr": 0.010609253699619367 }, "harness|drop|3": { "em": 0.15383808724832215, "em_stderr": 0.0036948628598682874, "f1": 0.22225880872483197, "f1_stderr": 0.0037670501187578413 }, "harness|gsm8k|5": { "acc": 0.13115996967399546, "acc_stderr": 0.00929849923558785 }, "harness|winogrande|5": { "acc": 0.7647987371744278, "acc_stderr": 0.011920008163650884 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38668
[ [ -0.033233642578125, -0.052154541015625, 0.0099029541015625, 0.01163482666015625, -0.01486968994140625, 0.00533294677734375, -0.0237884521484375, -0.0187225341796875, 0.0302734375, 0.039581298828125, -0.051361083984375, -0.0672607421875, -0.049896240234375, 0...
satoblack/asd
2023-10-08T17:46:41.000Z
[ "region:us" ]
satoblack
null
null
0
0
2023-10-08T17:46:41
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
SojeongKim/nlp_study
2023-10-08T17:53:03.000Z
[ "region:us" ]
SojeongKim
null
null
0
0
2023-10-08T17:53:03
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_LeoLM__leo-hessianai-13b
2023-10-23T18:17:03.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T17:59:54
--- pretty_name: Evaluation run of LeoLM/leo-hessianai-13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LeoLM/leo-hessianai-13b](https://huggingface.co/LeoLM/leo-hessianai-13b) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeoLM__leo-hessianai-13b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T18:16:50.877675](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-13b/blob/main/results_2023-10-23T18-16-50.877675.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\ \ \"em_stderr\": 0.0003778609196460785,\n \"f1\": 0.05912332214765112,\n\ \ \"f1_stderr\": 0.001345589828621863,\n \"acc\": 0.425157060340252,\n\ \ \"acc_stderr\": 0.00992506244739182\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460785,\n\ \ \"f1\": 0.05912332214765112,\n \"f1_stderr\": 0.001345589828621863\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08946171341925702,\n \ \ \"acc_stderr\": 0.007861583049939733\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843905\n\ \ }\n}\n```" repo_url: https://huggingface.co/LeoLM/leo-hessianai-13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|arc:challenge|25_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T17-59-31.182651.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T18_16_50.877675 path: - '**/details_harness|drop|3_2023-10-23T18-16-50.877675.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T18-16-50.877675.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T18_16_50.877675 path: - '**/details_harness|gsm8k|5_2023-10-23T18-16-50.877675.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T18-16-50.877675.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hellaswag|10_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T17-59-31.182651.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T17_59_31.182651 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T17-59-31.182651.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T17-59-31.182651.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T18_16_50.877675 path: - '**/details_harness|winogrande|5_2023-10-23T18-16-50.877675.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T18-16-50.877675.parquet' - config_name: results data_files: - split: 2023_10_08T17_59_31.182651 path: - results_2023-10-08T17-59-31.182651.parquet - split: 2023_10_23T18_16_50.877675 path: - results_2023-10-23T18-16-50.877675.parquet - split: latest path: - results_2023-10-23T18-16-50.877675.parquet --- # Dataset Card for Evaluation run of LeoLM/leo-hessianai-13b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/LeoLM/leo-hessianai-13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [LeoLM/leo-hessianai-13b](https://huggingface.co/LeoLM/leo-hessianai-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T18:16:50.877675](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-13b/blob/main/results_2023-10-23T18-16-50.877675.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.001363255033557047, "em_stderr": 0.0003778609196460785, "f1": 0.05912332214765112, "f1_stderr": 0.001345589828621863, "acc": 0.425157060340252, "acc_stderr": 0.00992506244739182 }, "harness|drop|3": { "em": 0.001363255033557047, "em_stderr": 0.0003778609196460785, "f1": 0.05912332214765112, "f1_stderr": 0.001345589828621863 }, "harness|gsm8k|5": { "acc": 0.08946171341925702, "acc_stderr": 0.007861583049939733 }, "harness|winogrande|5": { "acc": 0.760852407261247, "acc_stderr": 0.011988541844843905 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,590
[ [ -0.032012939453125, -0.04498291015625, 0.0066680908203125, 0.02105712890625, -0.009552001953125, 0.00489044189453125, -0.0281829833984375, -0.018463134765625, 0.031463623046875, 0.033538818359375, -0.05377197265625, -0.0665283203125, -0.04559326171875, 0.013...
open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test
2023-10-08T18:15:11.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T18:14:11
--- pretty_name: Evaluation run of Lazycuber/L2-7b-Guanaco-Random-Test dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Lazycuber/L2-7b-Guanaco-Random-Test](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-08T18:13:47.081600](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test/blob/main/results_2023-10-08T18-13-47.081600.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47820349788584665,\n\ \ \"acc_stderr\": 0.03520803674350638,\n \"acc_norm\": 0.4820937504834085,\n\ \ \"acc_norm_stderr\": 0.03519557788566828,\n \"mc1\": 0.27906976744186046,\n\ \ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.4232640996589444,\n\ \ \"mc2_stderr\": 0.01477991946603906\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\ \ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5723959370643298,\n\ \ \"acc_stderr\": 0.004937199759947679,\n \"acc_norm\": 0.7720573590918144,\n\ \ \"acc_norm_stderr\": 0.004186480645315568\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\ \ \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n\ \ \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\ \ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\ \ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \ \ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\ \ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\ \ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\ \ \"acc_norm_stderr\": 
0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\ \ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n\ \ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\ \ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\ \ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\ \ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\ \ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.31216931216931215,\n \"acc_stderr\": 0.0238652068369726,\n \"\ acc_norm\": 
0.31216931216931215,\n \"acc_norm_stderr\": 0.0238652068369726\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\ \ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\ \ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n\ \ \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.5290322580645161,\n\ \ \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\ \ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\ : 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\ \ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\ acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\ \ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n\ \ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\ \ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\ acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6642201834862386,\n \"acc_stderr\": 0.020248081396752927,\n \"\ acc_norm\": 0.6642201834862386,\n \"acc_norm_stderr\": 0.020248081396752927\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536016,\n \"\ acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536016\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088298,\n \"\ acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088298\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6286919831223629,\n \"acc_stderr\": 0.0314506860074486,\n \ \ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.0314506860074486\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\ \ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\ \ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\ \ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"\ acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\ \ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\ \ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\ \ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\ \ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\ \ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.7307692307692307,\n\ \ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \ \ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\ \ \"acc_stderr\": 0.016617501738763387,\n \"acc_norm\": 0.6845466155810983,\n\ \ \"acc_norm_stderr\": 0.016617501738763387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.02688264343402289,\n\ \ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.02688264343402289\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n\ \ \"acc_stderr\": 0.014005843570897895,\n 
\"acc_norm\": 0.22681564245810057,\n\ \ \"acc_norm_stderr\": 0.014005843570897895\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\ \ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\ \ \"acc_stderr\": 0.02827435985489426,\n \"acc_norm\": 0.5466237942122186,\n\ \ \"acc_norm_stderr\": 0.02827435985489426\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668763,\n\ \ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668763\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199495,\n \ \ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199495\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32790091264667537,\n\ \ \"acc_stderr\": 0.011989936640666525,\n \"acc_norm\": 0.32790091264667537,\n\ \ \"acc_norm_stderr\": 0.011989936640666525\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280065,\n\ \ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280065\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.46895424836601307,\n \"acc_stderr\": 0.020188804456361883,\n \ \ \"acc_norm\": 0.46895424836601307,\n \"acc_norm_stderr\": 0.020188804456361883\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\ \ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\ \ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.03197694118713672,\n\ \ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.03197694118713672\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\ \ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\ \ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \ \ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\ \ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\ \ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\ \ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\ \ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.4232640996589444,\n\ \ \"mc2_stderr\": 0.01477991946603906\n }\n}\n```" repo_url: https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|arc:challenge|25_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hellaswag|10_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-13-47.081600.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T18_13_47.081600 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-13-47.081600.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-13-47.081600.parquet' - config_name: results data_files: - split: 2023_10_08T18_13_47.081600 path: - results_2023-10-08T18-13-47.081600.parquet - split: latest path: - results_2023-10-08T18-13-47.081600.parquet
---

# Dataset Card for Evaluation run of Lazycuber/L2-7b-Guanaco-Random-Test

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model
[Lazycuber/L2-7b-Guanaco-Random-Test](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Random-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-08T18:13:47.081600](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Random-Test/blob/main/results_2023-10-08T18-13-47.081600.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.47820349788584665, "acc_stderr": 0.03520803674350638, "acc_norm": 0.4820937504834085, "acc_norm_stderr": 0.03519557788566828, "mc1": 0.27906976744186046, "mc1_stderr": 0.0157021070906279, "mc2": 0.4232640996589444, "mc2_stderr": 0.01477991946603906 }, "harness|arc:challenge|25": { "acc": 0.4761092150170648, "acc_stderr": 0.014594701798071654, "acc_norm": 0.5059726962457338, "acc_norm_stderr": 0.014610348300255795 }, "harness|hellaswag|10": { "acc": 0.5723959370643298, "acc_stderr": 0.004937199759947679, "acc_norm": 0.7720573590918144, "acc_norm_stderr": 0.004186480645315568 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.042849586397533994, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.042849586397533994 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5131578947368421, "acc_stderr": 0.04067533136309173, "acc_norm": 0.5131578947368421, "acc_norm_stderr": 0.04067533136309173 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5169811320754717, "acc_stderr": 0.030755120364119905, "acc_norm": 0.5169811320754717, "acc_norm_stderr": 0.030755120364119905 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5138888888888888, "acc_stderr": 0.041795966175810016, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.041795966175810016 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, 
"acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3699421965317919, "acc_stderr": 0.036812296333943194, "acc_norm": 0.3699421965317919, "acc_norm_stderr": 0.036812296333943194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.042801058373643966, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.042801058373643966 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.03232146916224469, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.03232146916224469 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.041665675771015785, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.31216931216931215, "acc_stderr": 0.0238652068369726, "acc_norm": 0.31216931216931215, "acc_norm_stderr": 0.0238652068369726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.24603174603174602, "acc_stderr": 0.038522733649243156, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.038522733649243156 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5290322580645161, "acc_stderr": 0.028396016402761005, "acc_norm": 0.5290322580645161, "acc_norm_stderr": 0.028396016402761005 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.03413963805906235, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.03413963805906235 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5818181818181818, "acc_stderr": 0.03851716319398395, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.03851716319398395 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5909090909090909, "acc_stderr": 0.03502975799413007, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.03502975799413007 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6683937823834197, "acc_stderr": 0.03397636541089118, "acc_norm": 0.6683937823834197, "acc_norm_stderr": 0.03397636541089118 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4128205128205128, "acc_stderr": 0.024962683564331803, "acc_norm": 0.4128205128205128, "acc_norm_stderr": 0.024962683564331803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085626, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085626 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3907563025210084, "acc_stderr": 0.031693802357129965, "acc_norm": 0.3907563025210084, "acc_norm_stderr": 0.031693802357129965 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.03802039760107903, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.03802039760107903 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6642201834862386, "acc_stderr": 0.020248081396752927, "acc_norm": 0.6642201834862386, "acc_norm_stderr": 0.020248081396752927 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2962962962962963, "acc_stderr": 
0.031141447823536016, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.031141447823536016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6421568627450981, "acc_stderr": 0.03364487286088298, "acc_norm": 0.6421568627450981, "acc_norm_stderr": 0.03364487286088298 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6286919831223629, "acc_stderr": 0.0314506860074486, "acc_norm": 0.6286919831223629, "acc_norm_stderr": 0.0314506860074486 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5560538116591929, "acc_stderr": 0.03334625674242728, "acc_norm": 0.5560538116591929, "acc_norm_stderr": 0.03334625674242728 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5572519083969466, "acc_stderr": 0.04356447202665069, "acc_norm": 0.5572519083969466, "acc_norm_stderr": 0.04356447202665069 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6694214876033058, "acc_stderr": 0.04294340845212093, "acc_norm": 0.6694214876033058, "acc_norm_stderr": 0.04294340845212093 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04750077341199984, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04750077341199984 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5153374233128835, "acc_stderr": 0.03926522378708843, "acc_norm": 0.5153374233128835, "acc_norm_stderr": 0.03926522378708843 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.6407766990291263, "acc_stderr": 0.047504583990416946, "acc_norm": 0.6407766990291263, "acc_norm_stderr": 0.047504583990416946 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7307692307692307, "acc_stderr": 0.029058588303748842, "acc_norm": 0.7307692307692307, "acc_norm_stderr": 0.029058588303748842 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.51, "acc_stderr": 
0.05024183937956913, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956913 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6845466155810983, "acc_stderr": 0.016617501738763387, "acc_norm": 0.6845466155810983, "acc_norm_stderr": 0.016617501738763387 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5260115606936416, "acc_stderr": 0.02688264343402289, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.02688264343402289 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.22681564245810057, "acc_stderr": 0.014005843570897895, "acc_norm": 0.22681564245810057, "acc_norm_stderr": 0.014005843570897895 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5392156862745098, "acc_stderr": 0.028541722692618874, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.028541722692618874 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5466237942122186, "acc_stderr": 0.02827435985489426, "acc_norm": 0.5466237942122186, "acc_norm_stderr": 0.02827435985489426 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.027628737155668763, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.027628737155668763 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3617021276595745, "acc_stderr": 0.028663820147199495, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.028663820147199495 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.32790091264667537, "acc_stderr": 0.011989936640666525, "acc_norm": 0.32790091264667537, "acc_norm_stderr": 0.011989936640666525 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.39705882352941174, "acc_stderr": 0.029722152099280065, "acc_norm": 0.39705882352941174, "acc_norm_stderr": 0.029722152099280065 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.46895424836601307, "acc_stderr": 0.020188804456361883, "acc_norm": 0.46895424836601307, "acc_norm_stderr": 0.020188804456361883 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.509090909090909, 
"acc_stderr": 0.0478833976870286, "acc_norm": 0.509090909090909, "acc_norm_stderr": 0.0478833976870286 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5224489795918368, "acc_stderr": 0.03197694118713672, "acc_norm": 0.5224489795918368, "acc_norm_stderr": 0.03197694118713672 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6218905472636815, "acc_stderr": 0.034288678487786564, "acc_norm": 0.6218905472636815, "acc_norm_stderr": 0.034288678487786564 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.0352821125824523, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.0352821125824523 }, "harness|truthfulqa:mc|0": { "mc1": 0.27906976744186046, "mc1_stderr": 0.0157021070906279, "mc2": 0.4232640996589444, "mc2_stderr": 0.01477991946603906 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
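The per-task scores in the results block above are plain nested dicts keyed as `harness|<task>|<n_shot>`, each carrying metric fields such as `acc` and `acc_stderr`. A minimal sketch of aggregating the MMLU (`hendrycksTest`) accuracies from such a dict — the two task entries are copied from the results above, but the trimmed `results` dict itself is illustrative:

```python
import statistics

# Two MMLU-style entries copied from the results block above; the layout
# ("harness|<task>|<n_shot>" -> metric dict) matches the card, but this
# trimmed `results` dict is illustrative, not the full result set.
results = {
    "harness|hendrycksTest-management|5": {
        "acc": 0.6407766990291263, "acc_stderr": 0.047504583990416946,
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.7307692307692307, "acc_stderr": 0.029058588303748842,
    },
    # Non-MMLU harness entries use different metric keys (mc1/mc2 here),
    # so they must be filtered out before averaging "acc".
    "harness|truthfulqa:mc|0": {
        "mc1": 0.27906976744186046, "mc1_stderr": 0.0157021070906279,
    },
}

# Average accuracy over the hendrycksTest (MMLU) tasks only.
mmlu_accs = [
    metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
]
mean_acc = statistics.mean(mmlu_accs)
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {mean_acc:.4f}")
```

Filtering on the `harness|hendrycksTest-` prefix is what keeps entries with other metric schemas (e.g. `truthfulqa:mc`) out of the average.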
65,065
[ [ -0.048095703125, -0.0633544921875, 0.019775390625, 0.015838623046875, -0.01068878173828125, -0.00519561767578125, 0.00002396106719970703, -0.01551055908203125, 0.03948974609375, -0.006221771240234375, -0.03289794921875, -0.046051025390625, -0.029296875, 0.01...
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0
2023-10-24T07:06:16.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T18:14:55
--- pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-3.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [jondurbin/airoboros-l2-13b-3.0](https://huggingface.co/jondurbin/airoboros-l2-13b-3.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T07:06:03.975558](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0/blob/main/results_2023-10-24T07-06-03.975558.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20868288590604026,\n\ \ \"em_stderr\": 0.004161580956848853,\n \"f1\": 0.26992973993288605,\n\ \ \"f1_stderr\": 0.004166447885566019,\n \"acc\": 0.4255516933315701,\n\ \ \"acc_stderr\": 0.009918265858821027\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.20868288590604026,\n \"em_stderr\": 0.004161580956848853,\n\ \ \"f1\": 0.26992973993288605,\n \"f1_stderr\": 0.004166447885566019\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08946171341925702,\n \ \ \"acc_stderr\": 0.007861583049939738\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702314\n\ \ }\n}\n```" repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-3.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|arc:challenge|25_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T18-14-31.712178.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T07_06_03.975558 path: - '**/details_harness|drop|3_2023-10-24T07-06-03.975558.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T07-06-03.975558.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T07_06_03.975558 path: - '**/details_harness|gsm8k|5_2023-10-24T07-06-03.975558.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T07-06-03.975558.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hellaswag|10_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T18_14_31.712178 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-14-31.712178.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T18_14_31.712178 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-14-31.712178.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-14-31.712178.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T07_06_03.975558 path: - '**/details_harness|winogrande|5_2023-10-24T07-06-03.975558.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T07-06-03.975558.parquet' - config_name: results data_files: - split: 2023_10_08T18_14_31.712178 path: - results_2023-10-08T18-14-31.712178.parquet - split: 2023_10_24T07_06_03.975558 path: - results_2023-10-24T07-06-03.975558.parquet - split: latest path: - results_2023-10-24T07-06-03.975558.parquet --- # Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-3.0 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-3.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-3.0](https://huggingface.co/jondurbin/airoboros-l2-13b-3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T07:06:03.975558](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-3.0/blob/main/results_2023-10-24T07-06-03.975558.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.20868288590604026, "em_stderr": 0.004161580956848853, "f1": 0.26992973993288605, "f1_stderr": 0.004166447885566019, "acc": 0.4255516933315701, "acc_stderr": 0.009918265858821027 }, "harness|drop|3": { "em": 0.20868288590604026, "em_stderr": 0.004161580956848853, "f1": 0.26992973993288605, "f1_stderr": 0.004166447885566019 }, "harness|gsm8k|5": { "acc": 0.08946171341925702, "acc_stderr": 0.007861583049939738 }, "harness|winogrande|5": { "acc": 0.7616416732438832, "acc_stderr": 0.011974948667702314 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38672
[ [ -0.032379150390625, -0.0484619140625, 0.00685882568359375, 0.0159759521484375, -0.01114654541015625, 0.00495147705078125, -0.02691650390625, -0.0160675048828125, 0.0305328369140625, 0.041259765625, -0.0478515625, -0.06475830078125, -0.051544189453125, 0.0132...
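Both leaderboard-card rows in this dump encode the same naming scheme: each harness task maps to a `config_name` (non-alphanumeric separators become underscores, the shot count is appended) and to a parquet glob built from the task, shot count, and run timestamp. A minimal sketch of that mapping; the helper names are my own for illustration, not part of any library API:

```python
def harness_config_name(task: str, shots: int) -> str:
    # Mirrors the config naming in the YAML above, e.g.
    # ("hendrycksTest-logical_fallacies", 5) -> "harness_hendrycksTest_logical_fallacies_5"
    # ("truthfulqa:mc", 0)                  -> "harness_truthfulqa_mc_0"
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{shots}"


def harness_parquet_glob(task: str, shots: int, timestamp: str) -> str:
    # Mirrors the data_files globs in the YAML above, e.g.
    # ("winogrande", 5, "2023-10-24T07-06-03.975558") ->
    #   "**/details_harness|winogrande|5_2023-10-24T07-06-03.975558.parquet"
    return f"**/details_harness|{task}|{shots}_{timestamp}.parquet"
```

The "latest" split of each config simply points at the glob for the most recent run's timestamp.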
SumanthRH/rte_few_shot_arc
2023-10-08T18:26:17.000Z
[ "region:us" ]
SumanthRH
null
null
0
0
2023-10-08T18:24:42
--- dataset_info: features: - name: problem dtype: string - name: solution dtype: string - name: response dtype: string - name: message list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 437310 num_examples: 200 download_size: 125236 dataset_size: 437310 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "rte_few_shot_arc" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
622
[ [ -0.0443115234375, -0.0299224853515625, 0.009521484375, 0.00406646728515625, -0.0178070068359375, -0.019378662109375, 0.027374267578125, -0.01062774658203125, 0.05291748046875, 0.0433349609375, -0.04803466796875, -0.0645751953125, -0.045867919921875, -0.00991...
open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30
2023-10-27T13:39:08.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T18:25:46
--- pretty_name: Evaluation run of JosephusCheung/Pwen-14B-Chat-20_30 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [JosephusCheung/Pwen-14B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-27T13:38:56.103845](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30/blob/main/results_2023-10-27T13-38-56.103845.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2828229865771812,\n\ \ \"em_stderr\": 0.004612221798127954,\n \"f1\": 0.3398972315436241,\n\ \ \"f1_stderr\": 0.004521141568402689,\n \"acc\": 0.5173500888298219,\n\ \ \"acc_stderr\": 0.012073725510059884\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2828229865771812,\n \"em_stderr\": 0.004612221798127954,\n\ \ \"f1\": 0.3398972315436241,\n \"f1_stderr\": 0.004521141568402689\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2699014404852161,\n \ \ \"acc_stderr\": 0.012227442856468897\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650872\n\ \ }\n}\n```" repo_url: https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|arc:challenge|25_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T18-25-24.586385.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_27T13_38_56.103845 path: - '**/details_harness|drop|3_2023-10-27T13-38-56.103845.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-27T13-38-56.103845.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_27T13_38_56.103845 path: - '**/details_harness|gsm8k|5_2023-10-27T13-38-56.103845.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-27T13-38-56.103845.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hellaswag|10_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T18_25_24.586385 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-25-24.586385.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T18_25_24.586385 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-25-24.586385.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-25-24.586385.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T13_38_56.103845 path: - '**/details_harness|winogrande|5_2023-10-27T13-38-56.103845.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-27T13-38-56.103845.parquet' - config_name: results data_files: - split: 2023_10_08T18_25_24.586385 path: - results_2023-10-08T18-25-24.586385.parquet - split: 2023_10_27T13_38_56.103845 path: - results_2023-10-27T13-38-56.103845.parquet - split: latest path: - results_2023-10-27T13-38-56.103845.parquet --- # Dataset Card for Evaluation run of JosephusCheung/Pwen-14B-Chat-20_30 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-14B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-14B-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T13:38:56.103845](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-14B-Chat-20_30/blob/main/results_2023-10-27T13-38-56.103845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You will find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2828229865771812, "em_stderr": 0.004612221798127954, "f1": 0.3398972315436241, "f1_stderr": 0.004521141568402689, "acc": 0.5173500888298219, "acc_stderr": 0.012073725510059884 }, "harness|drop|3": { "em": 0.2828229865771812, "em_stderr": 0.004612221798127954, "f1": 0.3398972315436241, "f1_stderr": 0.004521141568402689 }, "harness|gsm8k|5": { "acc": 0.2699014404852161, "acc_stderr": 0.012227442856468897 }, "harness|winogrande|5": { "acc": 0.7647987371744278, "acc_stderr": 0.011920008163650872 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
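The nested results JSON above (task keys such as `harness|gsm8k|5`, each mapping metric names to values alongside their `*_stderr` companions) can be flattened into simple per-task metric pairs with plain dictionary handling. A minimal standard-library sketch, where the `results` dict simply restates the values from the JSON block above rather than fetching anything from the Hub:

```python
# Minimal sketch: flatten the nested "Latest results" JSON shown above into
# (task, metric) -> value pairs, dropping the *_stderr entries.
# The `results` dict restates the values from the JSON block above.
results = {
    "all": {
        "em": 0.2828229865771812,
        "em_stderr": 0.004612221798127954,
        "f1": 0.3398972315436241,
        "f1_stderr": 0.004521141568402689,
        "acc": 0.5173500888298219,
        "acc_stderr": 0.012073725510059884,
    },
    "harness|drop|3": {
        "em": 0.2828229865771812,
        "em_stderr": 0.004612221798127954,
        "f1": 0.3398972315436241,
        "f1_stderr": 0.004521141568402689,
    },
    "harness|gsm8k|5": {"acc": 0.2699014404852161, "acc_stderr": 0.012227442856468897},
    "harness|winogrande|5": {"acc": 0.7647987371744278, "acc_stderr": 0.011920008163650872},
}

# Keep only the point estimates, keyed by (task, metric).
flat = {
    (task, metric): value
    for task, metrics in results.items()
    for metric, value in metrics.items()
    if not metric.endswith("_stderr")
}

for (task, metric), value in sorted(flat.items()):
    print(f"{task:24s} {metric}: {value:.4f}")
```

The same shape holds for any results file in these repos, so the comprehension works unchanged on the JSON loaded from a `results_*.json` artifact.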
38,710
[ [ -0.02508544921875, -0.053009033203125, 0.01093292236328125, 0.0206451416015625, -0.0080413818359375, 0.006923675537109375, -0.03546142578125, -0.0123443603515625, 0.033935546875, 0.041595458984375, -0.04913330078125, -0.0673828125, -0.05035400390625, 0.01066...
open-llm-leaderboard/details_Writer__palmyra-20b-chat
2023-10-24T17:35:00.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T18:46:21
--- pretty_name: Evaluation run of Writer/palmyra-20b-chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Writer/palmyra-20b-chat](https://huggingface.co/Writer/palmyra-20b-chat) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-20b-chat\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T17:34:48.335583](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-20b-chat/blob/main/results_2023-10-24T17-34-48.335583.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01373741610738255,\n\ \ \"em_stderr\": 0.0011920334890960986,\n \"f1\": 0.07696308724832225,\n\ \ \"f1_stderr\": 0.0018555585236602612,\n \"acc\": 0.3519928816466039,\n\ \ \"acc_stderr\": 0.009314927967596935\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.01373741610738255,\n \"em_stderr\": 0.0011920334890960986,\n\ \ \"f1\": 0.07696308724832225,\n \"f1_stderr\": 0.0018555585236602612\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.039423805913570885,\n \ \ \"acc_stderr\": 0.005360280030342453\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.664561957379637,\n \"acc_stderr\": 0.013269575904851418\n\ \ }\n}\n```" repo_url: https://huggingface.co/Writer/palmyra-20b-chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|arc:challenge|25_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T18-46-04.606475.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T17_34_48.335583 path: - '**/details_harness|drop|3_2023-10-24T17-34-48.335583.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T17-34-48.335583.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T17_34_48.335583 path: - '**/details_harness|gsm8k|5_2023-10-24T17-34-48.335583.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T17-34-48.335583.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hellaswag|10_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T18_46_04.606475 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T18-46-04.606475.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T18_46_04.606475 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-46-04.606475.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T18-46-04.606475.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T17_34_48.335583 path: - '**/details_harness|winogrande|5_2023-10-24T17-34-48.335583.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T17-34-48.335583.parquet' - config_name: results data_files: - split: 2023_10_08T18_46_04.606475 path: - results_2023-10-08T18-46-04.606475.parquet - split: 2023_10_24T17_34_48.335583 path: - results_2023-10-24T17-34-48.335583.parquet - split: latest path: - results_2023-10-24T17-34-48.335583.parquet --- # Dataset Card for Evaluation run of Writer/palmyra-20b-chat ## Dataset Description - 
**Homepage:**
- **Repository:** https://huggingface.co/Writer/palmyra-20b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Writer/palmyra-20b-chat](https://huggingface.co/Writer/palmyra-20b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-20b-chat",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T17:34:48.335583](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-20b-chat/blob/main/results_2023-10-24T17-34-48.335583.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and in the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.01373741610738255,
        "em_stderr": 0.0011920334890960986,
        "f1": 0.07696308724832225,
        "f1_stderr": 0.0018555585236602612,
        "acc": 0.3519928816466039,
        "acc_stderr": 0.009314927967596935
    },
    "harness|drop|3": {
        "em": 0.01373741610738255,
        "em_stderr": 0.0011920334890960986,
        "f1": 0.07696308724832225,
        "f1_stderr": 0.0018555585236602612
    },
    "harness|gsm8k|5": {
        "acc": 0.039423805913570885,
        "acc_stderr": 0.005360280030342453
    },
    "harness|winogrande|5": {
        "acc": 0.664561957379637,
        "acc_stderr": 0.013269575904851418
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
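The run splits in the configurations above are named after evaluation timestamps (e.g. `2023_10_24T17_34_48.335583`). As a minimal sketch, such names can be parsed and sorted to recover which run the "latest" split mirrors; the `parse_run_split` helper below is our own illustration, not part of the `datasets` API:

```python
from datetime import datetime

def parse_run_split(name: str) -> datetime:
    """Parse a run-split name such as '2023_10_24T17_34_48.335583'
    into a datetime so runs can be ordered chronologically."""
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

# The two run splits listed for this model in the configs above.
splits = ["2023_10_08T18_46_04.606475", "2023_10_24T17_34_48.335583"]

# The most recent run, i.e. the one the "latest" split points to.
latest = max(splits, key=parse_run_split)
# latest == "2023_10_24T17_34_48.335583"
```

Because the format zero-pads every field, plain string sorting happens to give the same order here; parsing to `datetime` just makes the intent explicit.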
38,596
[ [ -0.026275634765625, -0.053741455078125, 0.0145721435546875, 0.0198822021484375, -0.0042877197265625, 0.01019287109375, -0.03643798828125, -0.017578125, 0.0286407470703125, 0.04669189453125, -0.04888916015625, -0.07135009765625, -0.050079345703125, 0.01369476...
open-llm-leaderboard/details_harborwater__wizard-orca-3b
2023-10-24T08:46:12.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T19:21:36
--- pretty_name: Evaluation run of harborwater/wizard-orca-3b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [harborwater/wizard-orca-3b](https://huggingface.co/harborwater/wizard-orca-3b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__wizard-orca-3b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T08:46:00.865464](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__wizard-orca-3b/blob/main/results_2023-10-24T08-46-00.865464.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\ \ \"em_stderr\": 0.00045666764626669333,\n \"f1\": 0.05503670302013434,\n\ \ \"f1_stderr\": 0.0013533156474354355,\n \"acc\": 0.33995582743378455,\n\ \ \"acc_stderr\": 0.008022574604695198\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669333,\n\ \ \"f1\": 0.05503670302013434,\n \"f1_stderr\": 0.0013533156474354355\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \ \ \"acc_stderr\": 0.002822713322387704\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6692975532754538,\n \"acc_stderr\": 0.013222435887002691\n\ \ }\n}\n```" repo_url: https://huggingface.co/harborwater/wizard-orca-3b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|arc:challenge|25_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T19-21-18.723038.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T08_46_00.865464 path: - '**/details_harness|drop|3_2023-10-24T08-46-00.865464.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T08-46-00.865464.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T08_46_00.865464 path: - '**/details_harness|gsm8k|5_2023-10-24T08-46-00.865464.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T08-46-00.865464.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hellaswag|10_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-21-18.723038.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T19_21_18.723038 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-21-18.723038.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-21-18.723038.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T08_46_00.865464 path: - '**/details_harness|winogrande|5_2023-10-24T08-46-00.865464.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T08-46-00.865464.parquet' - config_name: results data_files: - split: 2023_10_08T19_21_18.723038 path: - results_2023-10-08T19-21-18.723038.parquet - split: 2023_10_24T08_46_00.865464 path: - results_2023-10-24T08-46-00.865464.parquet - split: latest path: - results_2023-10-24T08-46-00.865464.parquet --- # Dataset Card for Evaluation run of harborwater/wizard-orca-3b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/harborwater/wizard-orca-3b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [harborwater/wizard-orca-3b](https://huggingface.co/harborwater/wizard-orca-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_harborwater__wizard-orca-3b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T08:46:00.865464](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__wizard-orca-3b/blob/main/results_2023-10-24T08-46-00.865464.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0019924496644295304, "em_stderr": 0.00045666764626669333, "f1": 0.05503670302013434, "f1_stderr": 0.0013533156474354355, "acc": 0.33995582743378455, "acc_stderr": 0.008022574604695198 }, "harness|drop|3": { "em": 0.0019924496644295304, "em_stderr": 0.00045666764626669333, "f1": 0.05503670302013434, "f1_stderr": 0.0013533156474354355 }, "harness|gsm8k|5": { "acc": 0.01061410159211524, "acc_stderr": 0.002822713322387704 }, "harness|winogrande|5": { "acc": 0.6692975532754538, "acc_stderr": 0.013222435887002691 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
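The latest-results JSON above is a nested `{task: {metric: value}}` mapping. It can be flattened into per-task rows for quick tabulation; a minimal plain-Python sketch, with values copied from the results above (stderr fields omitted for brevity) and a `flatten` helper that is illustrative only, not part of the `datasets` API:

```python
# Aggregated results as shown above (stderr fields omitted for brevity).
latest_results = {
    "all": {"em": 0.0019924496644295304, "f1": 0.05503670302013434,
            "acc": 0.33995582743378455},
    "harness|drop|3": {"em": 0.0019924496644295304, "f1": 0.05503670302013434},
    "harness|gsm8k|5": {"acc": 0.01061410159211524},
    "harness|winogrande|5": {"acc": 0.6692975532754538},
}

def flatten(results):
    """Turn the nested {task: {metric: value}} mapping into flat rows."""
    return [
        (task, metric, value)
        for task, metrics in results.items()
        for metric, value in metrics.items()
    ]

# Print one row per (task, metric) pair.
for task, metric, value in flatten(latest_results):
    print(f"{task:24s} {metric:4s} {value:.4f}")
```

The same helper works unchanged on the JSON file linked above once it is loaded with `json.load`, since the file has the same two-level shape.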
38,646
[ [ -0.0267791748046875, -0.05145263671875, 0.01312255859375, 0.01427459716796875, -0.00829315185546875, 0.00439453125, -0.0166473388671875, -0.016387939453125, 0.0245513916015625, 0.04620361328125, -0.046051025390625, -0.073486328125, -0.047821044921875, 0.0116...
toninhodjj/morgan
2023-10-08T19:27:02.000Z
[ "region:us" ]
toninhodjj
null
null
0
0
2023-10-08T19:22:56
Entry not found
15
[ [ -0.0213775634765625, -0.014984130859375, 0.05718994140625, 0.0288543701171875, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005062103271484375, 0.051361083984375, 0.016998291015625, -0.0521240234375, -0.01496124267578125, -0.0604248046875, 0.037...
open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b
2023-10-26T12:10:51.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T19:36:13
--- pretty_name: Evaluation run of chargoddard/duplicitous-slurpbeast-13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chargoddard/duplicitous-slurpbeast-13b](https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T12:10:38.195509](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b/blob/main/results_2023-10-26T12-10-38.195509.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.022651006711409395,\n\ \        \"em_stderr\": 0.0015237307803438113,\n        \"f1\": 0.10978607382550301,\n\ \        \"f1_stderr\": 0.0022271926416287282,\n        \"acc\": 0.41926868133939454,\n\ \        \"acc_stderr\": 0.009980675697209198\n    },\n    \"harness|drop|3\": {\n\ \        \"em\": 0.022651006711409395,\n        \"em_stderr\": 0.0015237307803438113,\n\ \        \"f1\": 0.10978607382550301,\n        \"f1_stderr\": 0.0022271926416287282\n\ \    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.08794541319181198,\n    \ \    \"acc_stderr\": 0.007801162197487707\n    },\n    \"harness|winogrande|5\"\ : {\n        \"acc\": 0.7505919494869772,\n        \"acc_stderr\": 0.012160189196930689\n\ \    }\n}\n```" repo_url: https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|arc:challenge|25_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T19-35-50.428127.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T12_10_38.195509 path: - '**/details_harness|drop|3_2023-10-26T12-10-38.195509.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T12-10-38.195509.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T12_10_38.195509 path: - '**/details_harness|gsm8k|5_2023-10-26T12-10-38.195509.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T12-10-38.195509.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hellaswag|10_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-35-50.428127.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T19_35_50.428127 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-35-50.428127.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-35-50.428127.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T12_10_38.195509 path: - '**/details_harness|winogrande|5_2023-10-26T12-10-38.195509.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T12-10-38.195509.parquet' - config_name: results data_files: - split: 2023_10_08T19_35_50.428127 path: - results_2023-10-08T19-35-50.428127.parquet - split: 2023_10_26T12_10_38.195509 path: - results_2023-10-26T12-10-38.195509.parquet - split: latest path: - results_2023-10-26T12-10-38.195509.parquet --- # Dataset Card for Evaluation run of chargoddard/duplicitous-slurpbeast-13b ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [chargoddard/duplicitous-slurpbeast-13b](https://huggingface.co/chargoddard/duplicitous-slurpbeast-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T12:10:38.195509](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-slurpbeast-13b/blob/main/results_2023-10-26T12-10-38.195509.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.022651006711409395, "em_stderr": 0.0015237307803438113, "f1": 0.10978607382550301, "f1_stderr": 0.0022271926416287282, "acc": 0.41926868133939454, "acc_stderr": 0.009980675697209198 }, "harness|drop|3": { "em": 0.022651006711409395, "em_stderr": 0.0015237307803438113, "f1": 0.10978607382550301, "f1_stderr": 0.0022271926416287282 }, "harness|gsm8k|5": { "acc": 0.08794541319181198, "acc_stderr": 0.007801162197487707 }, "harness|winogrande|5": { "acc": 0.7505919494869772, "acc_stderr": 0.012160189196930689 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
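In the results JSON above, the "all" block appears to be the unweighted mean of the per-task metrics. A minimal sketch checking this for the accuracy figures, using plain Python with the numbers copied from the 2023-10-26 run:

```python
# Per-task accuracies copied from the results JSON above.
per_task_acc = {
    "harness|gsm8k|5": 0.08794541319181198,
    "harness|winogrande|5": 0.7505919494869772,
}

# The "all" section appears to report the unweighted mean across tasks.
acc_all = sum(per_task_acc.values()) / len(per_task_acc)
print(acc_all)  # ≈ 0.41926868133939454, the "all"/"acc" value above
```

The same relation holds for the reported `acc_stderr`: (0.007801162197487707 + 0.012160189196930689) / 2 = 0.009980675697209198, matching the aggregate.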
38782
[ [ -0.027679443359375, -0.0487060546875, 0.014312744140625, 0.015289306640625, -0.0117950439453125, 0.01325225830078125, -0.02227783203125, -0.012481689453125, 0.034942626953125, 0.046875, -0.050140380859375, -0.0682373046875, -0.052215576171875, 0.008506774902...
open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b
2023-10-28T12:16:47.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T19:36:39
--- pretty_name: Evaluation run of chargoddard/duplicitous-mammal-13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chargoddard/duplicitous-mammal-13b](https://huggingface.co/chargoddard/duplicitous-mammal-13b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T12:16:35.261597](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b/blob/main/results_2023-10-28T12-16-35.261597.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011115771812080536,\n\ \ \"em_stderr\": 0.0010736981082190872,\n \"f1\": 0.09195050335570452,\n\ \ \"f1_stderr\": 0.0019265640812138418,\n \"acc\": 0.42078498156683963,\n\ \ \"acc_stderr\": 0.01004075602047218\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.011115771812080536,\n \"em_stderr\": 0.0010736981082190872,\n\ \ \"f1\": 0.09195050335570452,\n \"f1_stderr\": 0.0019265640812138418\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09097801364670205,\n \ \ \"acc_stderr\": 0.00792132284401367\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930689\n\ \ }\n}\n```" repo_url: https://huggingface.co/chargoddard/duplicitous-mammal-13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|arc:challenge|25_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T19-36-16.264447.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T12_16_35.261597 path: - '**/details_harness|drop|3_2023-10-28T12-16-35.261597.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T12-16-35.261597.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T12_16_35.261597 path: - '**/details_harness|gsm8k|5_2023-10-28T12-16-35.261597.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T12-16-35.261597.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hellaswag|10_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-36-16.264447.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T19_36_16.264447 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-36-16.264447.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-36-16.264447.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T12_16_35.261597 path: - '**/details_harness|winogrande|5_2023-10-28T12-16-35.261597.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T12-16-35.261597.parquet' - config_name: results data_files: - split: 2023_10_08T19_36_16.264447 path: - results_2023-10-08T19-36-16.264447.parquet - split: 2023_10_28T12_16_35.261597 path: - results_2023-10-28T12-16-35.261597.parquet - split: latest path: - results_2023-10-28T12-16-35.261597.parquet --- # Dataset Card for Evaluation run of chargoddard/duplicitous-mammal-13b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/chargoddard/duplicitous-mammal-13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [chargoddard/duplicitous-mammal-13b](https://huggingface.co/chargoddard/duplicitous-mammal-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T12:16:35.261597](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__duplicitous-mammal-13b/blob/main/results_2023-10-28T12-16-35.261597.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.011115771812080536, "em_stderr": 0.0010736981082190872, "f1": 0.09195050335570452, "f1_stderr": 0.0019265640812138418, "acc": 0.42078498156683963, "acc_stderr": 0.01004075602047218 }, "harness|drop|3": { "em": 0.011115771812080536, "em_stderr": 0.0010736981082190872, "f1": 0.09195050335570452, "f1_stderr": 0.0019265640812138418 }, "harness|gsm8k|5": { "acc": 0.09097801364670205, "acc_stderr": 0.00792132284401367 }, "harness|winogrande|5": { "acc": 0.7505919494869772, "acc_stderr": 0.012160189196930689 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,730
[ [ -0.0303192138671875, -0.0472412109375, 0.01300811767578125, 0.0212860107421875, -0.0159759521484375, 0.01107025146484375, -0.023345947265625, -0.01485443115234375, 0.034576416015625, 0.044525146484375, -0.0533447265625, -0.06927490234375, -0.0482177734375, 0...
open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2
2023-10-23T09:23:39.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T19:44:10
--- pretty_name: Evaluation run of caisarl76/mistral-guanaco1k-ep2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [caisarl76/mistral-guanaco1k-ep2](https://huggingface.co/caisarl76/mistral-guanaco1k-ep2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T09:23:27.252152](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2/blob/main/results_2023-10-23T09-23-27.252152.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\ \ \"em_stderr\": 0.0004913221265094507,\n \"f1\": 0.06542994966442944,\n\ \ \"f1_stderr\": 0.001488633695023099,\n \"acc\": 0.4501858873976542,\n\ \ \"acc_stderr\": 0.010287740882080417\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094507,\n\ \ \"f1\": 0.06542994966442944,\n \"f1_stderr\": 0.001488633695023099\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \ \ \"acc_stderr\": 0.008944213403553055\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\ \ }\n}\n```" repo_url: https://huggingface.co/caisarl76/mistral-guanaco1k-ep2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|arc:challenge|25_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T19-43-46.755661.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T09_23_27.252152 path: - '**/details_harness|drop|3_2023-10-23T09-23-27.252152.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T09-23-27.252152.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T09_23_27.252152 path: - '**/details_harness|gsm8k|5_2023-10-23T09-23-27.252152.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T09-23-27.252152.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hellaswag|10_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-43-46.755661.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T19_43_46.755661 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-43-46.755661.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-43-46.755661.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T09_23_27.252152 path: - '**/details_harness|winogrande|5_2023-10-23T09-23-27.252152.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T09-23-27.252152.parquet' - config_name: results data_files: - split: 2023_10_08T19_43_46.755661 path: - results_2023-10-08T19-43-46.755661.parquet - split: 2023_10_23T09_23_27.252152 path: - results_2023-10-23T09-23-27.252152.parquet - split: latest path: - results_2023-10-23T09-23-27.252152.parquet --- # Dataset Card for Evaluation run of caisarl76/mistral-guanaco1k-ep2 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/caisarl76/mistral-guanaco1k-ep2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [caisarl76/mistral-guanaco1k-ep2](https://huggingface.co/caisarl76/mistral-guanaco1k-ep2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T09:23:27.252152](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2/blob/main/results_2023-10-23T09-23-27.252152.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.002307046979865772, "em_stderr": 0.0004913221265094507, "f1": 0.06542994966442944, "f1_stderr": 0.001488633695023099, "acc": 0.4501858873976542, "acc_stderr": 0.010287740882080417 }, "harness|drop|3": { "em": 0.002307046979865772, "em_stderr": 0.0004913221265094507, "f1": 0.06542994966442944, "f1_stderr": 0.001488633695023099 }, "harness|gsm8k|5": { "acc": 0.1197877179681577, "acc_stderr": 0.008944213403553055 }, "harness|winogrande|5": { "acc": 0.7805840568271507, "acc_stderr": 0.01163126836060778 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
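The aggregated JSON shown in the card above can also be post-processed locally once downloaded. As a small illustrative sketch (the task keys and field names follow the JSON reproduced above; this snippet is not part of the evaluation harness itself), one can pull the per-task accuracies out of the results dict:

```python
import json

# Aggregated metrics as emitted in the "latest results" JSON above
# (only a subset of fields is reproduced here for illustration).
results = json.loads("""
{
  "all": {"acc": 0.4501858873976542, "acc_stderr": 0.010287740882080417},
  "harness|drop|3": {"em": 0.002307046979865772, "f1": 0.06542994966442944},
  "harness|gsm8k|5": {"acc": 0.1197877179681577, "acc_stderr": 0.008944213403553055},
  "harness|winogrande|5": {"acc": 0.7805840568271507, "acc_stderr": 0.01163126836060778}
}
""")

# Keep per-task accuracy only, skipping the "all" aggregate and
# tasks (such as drop) that report em/f1 instead of acc.
per_task = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

for task, acc in sorted(per_task.items()):
    print(f"{task}: acc = {acc:.4f}")
```

In practice the same dict is obtained by reading the `results_*.json` file linked in the card, or by loading the "results" configuration of the dataset with `load_dataset`.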
38,688
[ [ -0.0269775390625, -0.0445556640625, 0.01177978515625, 0.0200347900390625, -0.013427734375, 0.0024166107177734375, -0.0234375, -0.010101318359375, 0.025848388671875, 0.039215087890625, -0.05133056640625, -0.0703125, -0.050140380859375, 0.00821685791015625, ...
Ayansk11/test
2023-10-08T19:50:56.000Z
[ "region:us" ]
Ayansk11
null
null
0
0
2023-10-08T19:45:18
Entry not found
15
[ [ -0.0213775634765625, -0.014984130859375, 0.05718994140625, 0.0288543701171875, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005062103271484375, 0.051361083984375, 0.016998291015625, -0.0521240234375, -0.01496124267578125, -0.0604248046875, 0.037...
Ayansk11/llama2_merged_file11
2023-10-08T19:52:41.000Z
[ "region:us" ]
Ayansk11
null
null
0
0
2023-10-08T19:51:42
Entry not found
15
[ [ -0.0213775634765625, -0.014984130859375, 0.05718994140625, 0.0288543701171875, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005062103271484375, 0.051361083984375, 0.016998291015625, -0.0521240234375, -0.01496124267578125, -0.0604248046875, 0.037...
open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo
2023-10-25T22:40:48.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T19:52:47
--- pretty_name: Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2-dpo dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [beaugogh/Llama2-7b-openorca-mc-v2-dpo](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-25T22:40:34.930470](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo/blob/main/results_2023-10-25T22-40-34.930470.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\ \ \"em_stderr\": 0.0003630560893119234,\n \"f1\": 0.05640729865771826,\n\ \ \"f1_stderr\": 0.0013382113030996202,\n \"acc\": 0.38661167934139673,\n\ \ \"acc_stderr\": 0.00909660619315009\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119234,\n\ \ \"f1\": 0.05640729865771826,\n \"f1_stderr\": 0.0013382113030996202\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04473085670962851,\n \ \ \"acc_stderr\": 0.005693886131407052\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n\ \ }\n}\n```" repo_url: https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|arc:challenge|25_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T19-52-28.810718.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_25T22_40_34.930470 path: - '**/details_harness|drop|3_2023-10-25T22-40-34.930470.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-25T22-40-34.930470.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_25T22_40_34.930470 path: - '**/details_harness|gsm8k|5_2023-10-25T22-40-34.930470.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-25T22-40-34.930470.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hellaswag|10_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T19-52-28.810718.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T19_52_28.810718 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-52-28.810718.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T19-52-28.810718.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_25T22_40_34.930470 path: - '**/details_harness|winogrande|5_2023-10-25T22-40-34.930470.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-25T22-40-34.930470.parquet' - config_name: results data_files: - split: 2023_10_08T19_52_28.810718 path: - results_2023-10-08T19-52-28.810718.parquet - split: 2023_10_25T22_40_34.930470 path: - results_2023-10-25T22-40-34.930470.parquet - split: latest path: - results_2023-10-25T22-40-34.930470.parquet --- # Dataset Card for Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2-dpo ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-openorca-mc-v2-dpo](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can, for instance, do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T22:40:34.930470](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2-dpo/blob/main/results_2023-10-25T22-40-34.930470.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0012583892617449664, "em_stderr": 0.0003630560893119234, "f1": 0.05640729865771826, "f1_stderr": 0.0013382113030996202, "acc": 0.38661167934139673, "acc_stderr": 0.00909660619315009 }, "harness|drop|3": { "em": 0.0012583892617449664, "em_stderr": 0.0003630560893119234, "f1": 0.05640729865771826, "f1_stderr": 0.0013382113030996202 }, "harness|gsm8k|5": { "acc": 0.04473085670962851, "acc_stderr": 0.005693886131407052 }, "harness|winogrande|5": { "acc": 0.728492501973165, "acc_stderr": 0.012499326254893129 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
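The aggregated results above are plain nested JSON (one object per task, plus an "all" aggregate). As a minimal sketch of how one might post-process them, the snippet below flattens such a results dict into per-task rows; the `flatten_results` helper is hypothetical (not part of the leaderboard tooling), and the hard-coded values are illustrative, copied from the 2023-10-25 run shown above.

```python
# Flatten a leaderboard-style results dict into (task, metric, value) rows.
# "results" reproduces (a subset of) the JSON shown above; "flatten_results"
# is an illustrative helper, not an official leaderboard API.
results = {
    "all": {"em": 0.0012583892617449664, "f1": 0.05640729865771826,
            "acc": 0.38661167934139673},
    "harness|drop|3": {"em": 0.0012583892617449664, "f1": 0.05640729865771826},
    "harness|gsm8k|5": {"acc": 0.04473085670962851},
    "harness|winogrande|5": {"acc": 0.728492501973165},
}

def flatten_results(res):
    """Yield (task, metric, value) triples, skipping the 'all' aggregate."""
    for task, metrics in res.items():
        if task == "all":
            continue
        for metric, value in metrics.items():
            yield task, metric, value

for task, metric, value in sorted(flatten_results(results)):
    print(f"{task:<22} {metric:>3} {value:.4f}")
```

The same loop works unchanged on the full per-run JSON files linked above, since they share this two-level task/metric layout (with the extra `*_stderr` keys appearing as additional metrics).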
38770
[ [ -0.0300445556640625, -0.050201416015625, 0.0173797607421875, 0.016937255859375, -0.00965118408203125, 0.0138702392578125, -0.0226287841796875, -0.0178680419921875, 0.0298004150390625, 0.041778564453125, -0.047760009765625, -0.06585693359375, -0.045440673828125, ...
open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf
2023-10-26T05:18:07.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T20:04:58
--- pretty_name: Evaluation run of ibranze/araproje-llama2-7b-hf dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [ibranze/araproje-llama2-7b-hf](https://huggingface.co/ibranze/araproje-llama2-7b-hf)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T05:17:54.107073](https://huggingface.co/datasets/open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf/blob/main/results_2023-10-26T05-17-54.107073.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\ \ \"em_stderr\": 0.00036305608931194434,\n \"f1\": 0.055925964765100665,\n\ \ \"f1_stderr\": 0.0013181664771628632,\n \"acc\": 0.4057988012013119,\n\ \ \"acc_stderr\": 0.00970458141675358\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931194434,\n\ \ \"f1\": 0.055925964765100665,\n \"f1_stderr\": 0.0013181664771628632\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \ \ \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\ \ }\n}\n```" repo_url: https://huggingface.co/ibranze/araproje-llama2-7b-hf leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-04-34.106747.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T05_17_54.107073 path: - '**/details_harness|drop|3_2023-10-26T05-17-54.107073.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T05-17-54.107073.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T05_17_54.107073 path: - '**/details_harness|gsm8k|5_2023-10-26T05-17-54.107073.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T05-17-54.107073.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hellaswag|10_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-34.106747.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_04_34.106747 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-04-34.106747.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-04-34.106747.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T05_17_54.107073 path: - '**/details_harness|winogrande|5_2023-10-26T05-17-54.107073.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T05-17-54.107073.parquet' - config_name: results data_files: - split: 2023_10_08T20_04_34.106747 path: - results_2023-10-08T20-04-34.106747.parquet - split: 2023_10_26T05_17_54.107073 path: - results_2023-10-26T05-17-54.107073.parquet - split: latest path: - results_2023-10-26T05-17-54.107073.parquet --- # Dataset Card for Evaluation run of ibranze/araproje-llama2-7b-hf ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/ibranze/araproje-llama2-7b-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [ibranze/araproje-llama2-7b-hf](https://huggingface.co/ibranze/araproje-llama2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T05:17:54.107073](https://huggingface.co/datasets/open-llm-leaderboard/details_ibranze__araproje-llama2-7b-hf/blob/main/results_2023-10-26T05-17-54.107073.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0012583892617449664, "em_stderr": 0.00036305608931194434, "f1": 0.055925964765100665, "f1_stderr": 0.0013181664771628632, "acc": 0.4057988012013119, "acc_stderr": 0.00970458141675358 }, "harness|drop|3": { "em": 0.0012583892617449664, "em_stderr": 0.00036305608931194434, "f1": 0.055925964765100665, "f1_stderr": 0.0013181664771628632 }, "harness|gsm8k|5": { "acc": 0.0712661106899166, "acc_stderr": 0.007086462127954491 }, "harness|winogrande|5": { "acc": 0.7403314917127072, "acc_stderr": 0.012322700705552667 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
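The per-task results shown in the card above share a flat `task -> {metric: score}` shape. A minimal sketch of iterating over that structure (metric values copied from the run shown above; this is an illustration of the JSON layout, not part of the official tooling):

```python
import json

# Per-task metrics copied from the "Latest results" JSON above; each
# top-level key is a harness task, each value a {metric: score} mapping.
results_json = """
{
    "harness|drop|3": {"em": 0.0012583892617449664, "f1": 0.055925964765100665},
    "harness|gsm8k|5": {"acc": 0.0712661106899166},
    "harness|winogrande|5": {"acc": 0.7403314917127072}
}
"""

results = json.loads(results_json)
for task, metrics in sorted(results.items()):
    for name, score in metrics.items():
        # Print one line per (task, metric) pair, scores rounded for display.
        print(f"{task:24} {name:4} {score:.4f}")
```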
38,680
[ [ -0.034393310546875, -0.04638671875, 0.0135345458984375, 0.02545166015625, -0.014617919921875, 0.01258087158203125, -0.0267333984375, -0.017242431640625, 0.029998779296875, 0.047027587890625, -0.044677734375, -0.06988525390625, -0.055145263671875, 0.016342163...
open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b
2023-10-28T09:33:25.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T20:23:05
--- pretty_name: Evaluation run of PygmalionAI/pygmalion-2-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PygmalionAI/pygmalion-2-7b](https://huggingface.co/PygmalionAI/pygmalion-2-7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T09:33:13.706982](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b/blob/main/results_2023-10-28T09-33-13.706982.json)\ \ (note that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\ \ \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05976614932885909,\n\ \ \"f1_stderr\": 0.0013611207374076375,\n \"acc\": 0.4075329125111523,\n\ \ \"acc_stderr\": 0.009436763896104398\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n\ \ \"f1\": 0.05976614932885909,\n \"f1_stderr\": 0.0013611207374076375\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06368460955269144,\n \ \ \"acc_stderr\": 0.006726213078805692\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403105\n\ \ }\n}\n```" repo_url: https://huggingface.co/PygmalionAI/pygmalion-2-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-22-41.887829.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T09_33_13.706982 path: - '**/details_harness|drop|3_2023-10-28T09-33-13.706982.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T09-33-13.706982.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T09_33_13.706982 path: - '**/details_harness|gsm8k|5_2023-10-28T09-33-13.706982.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T09-33-13.706982.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hellaswag|10_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-22-41.887829.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_22_41.887829 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-22-41.887829.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-22-41.887829.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T09_33_13.706982 path: - '**/details_harness|winogrande|5_2023-10-28T09-33-13.706982.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T09-33-13.706982.parquet' - config_name: results data_files: - split: 2023_10_08T20_22_41.887829 path: - results_2023-10-08T20-22-41.887829.parquet - split: 2023_10_28T09_33_13.706982 path: - results_2023-10-28T09-33-13.706982.parquet - split: latest path: - results_2023-10-28T09-33-13.706982.parquet --- # Dataset Card for Evaluation run of PygmalionAI/pygmalion-2-7b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/PygmalionAI/pygmalion-2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-2-7b](https://huggingface.co/PygmalionAI/pygmalion-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T09:33:13.706982](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2-7b/blob/main/results_2023-10-28T09-33-13.706982.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the "results" and the "latest" split for each eval): ```python { "all": { "em": 0.001153523489932886, "em_stderr": 0.00034761798968571027, "f1": 0.05976614932885909, "f1_stderr": 0.0013611207374076375, "acc": 0.4075329125111523, "acc_stderr": 0.009436763896104398 }, "harness|drop|3": { "em": 0.001153523489932886, "em_stderr": 0.00034761798968571027, "f1": 0.05976614932885909, "f1_stderr": 0.0013611207374076375 }, "harness|gsm8k|5": { "acc": 0.06368460955269144, "acc_stderr": 0.006726213078805692 }, "harness|winogrande|5": { "acc": 0.7513812154696132, "acc_stderr": 0.012147314713403105 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
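The `acc_stderr` values in the latest-results block above can be used to put rough error bars on a score when comparing models. A minimal sketch using the winogrande numbers from this run (the 1.96 multiplier assumes an approximately normal sampling distribution, which is a simplifying assumption, not something the harness reports):

```python
# Approximate 95% confidence interval for the reported winogrande accuracy.
# Values copied from the latest-results block above; z = 1.96 is the usual
# normal-approximation multiplier and is an assumption of this sketch.
acc = 0.7513812154696132
acc_stderr = 0.012147314713403105

low = acc - 1.96 * acc_stderr
high = acc + 1.96 * acc_stderr
print(f"winogrande acc: {acc:.4f} (95% CI: {low:.4f}-{high:.4f})")
```

If the intervals of two models overlap, the difference between them may not be meaningful at this eval's sample size.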
38,640
[ [ -0.02581787109375, -0.04302978515625, 0.0083465576171875, 0.0179290771484375, -0.01357269287109375, 0.0037326812744140625, -0.034759521484375, -0.0171356201171875, 0.02520751953125, 0.0302886962890625, -0.047943115234375, -0.05877685546875, -0.051544189453125, ...
open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b
2023-10-24T16:53:36.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T20:35:02
--- pretty_name: Evaluation run of zarakiquemparte/zarablend-1.1-l2-7b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [zarakiquemparte/zarablend-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T16:53:24.152575](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b/blob/main/results_2023-10-24T16-53-24.152575.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2829278523489933,\n\ \ \"em_stderr\": 0.0046127395092502785,\n \"f1\": 0.3596476510067135,\n\ \ \"f1_stderr\": 0.004549657562733716,\n \"acc\": 0.38580685542430376,\n\ \ \"acc_stderr\": 0.009136475194671255\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.2829278523489933,\n \"em_stderr\": 0.0046127395092502785,\n\ \ \"f1\": 0.3596476510067135,\n \"f1_stderr\": 0.004549657562733716\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \ \ \"acc_stderr\": 0.005739657656722217\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620292\n\ \ }\n}\n```" repo_url: https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-34-38.320909.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T16_53_24.152575 path: - '**/details_harness|drop|3_2023-10-24T16-53-24.152575.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T16-53-24.152575.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T16_53_24.152575 path: - '**/details_harness|gsm8k|5_2023-10-24T16-53-24.152575.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T16-53-24.152575.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hellaswag|10_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-34-38.320909.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_34_38.320909 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-34-38.320909.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-34-38.320909.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T16_53_24.152575 path: - '**/details_harness|winogrande|5_2023-10-24T16-53-24.152575.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T16-53-24.152575.parquet' - config_name: results data_files: - split: 2023_10_08T20_34_38.320909 path: - results_2023-10-08T20-34-38.320909.parquet - split: 2023_10_24T16_53_24.152575 path: - results_2023-10-24T16-53-24.152575.parquet - split: latest path: - results_2023-10-24T16-53-24.152575.parquet --- # Dataset Card for Evaluation run of zarakiquemparte/zarablend-1.1-l2-7b ## Dataset Description 
- **Homepage:** - **Repository:** https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [zarakiquemparte/zarablend-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zarablend-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T16:53:24.152575](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarablend-1.1-l2-7b/blob/main/results_2023-10-24T16-53-24.152575.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2829278523489933, "em_stderr": 0.0046127395092502785, "f1": 0.3596476510067135, "f1_stderr": 0.004549657562733716, "acc": 0.38580685542430376, "acc_stderr": 0.009136475194671255 }, "harness|drop|3": { "em": 0.2829278523489933, "em_stderr": 0.0046127395092502785, "f1": 0.3596476510067135, "f1_stderr": 0.004549657562733716 }, "harness|gsm8k|5": { "acc": 0.045489006823351025, "acc_stderr": 0.005739657656722217 }, "harness|winogrande|5": { "acc": 0.7261247040252565, "acc_stderr": 0.012533292732620292 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
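In the latest-results block above, the top-level `acc` under `"all"` is the unweighted mean of the per-task accuracies (the drop task reports em/f1 rather than acc, so it does not enter this average). A minimal standard-library sketch that recomputes it; the JSON literal is copied from the results quoted above rather than fetched from the Hub:

```python
import json

# Per-task accuracies copied verbatim from the "Latest results" JSON above.
per_task = json.loads("""
{
  "harness|gsm8k|5":      {"acc": 0.045489006823351025},
  "harness|winogrande|5": {"acc": 0.7261247040252565}
}
""")

# The aggregate "acc" under "all" is the unweighted mean over the tasks
# that report an accuracy.
accs = [scores["acc"] for scores in per_task.values()]
mean_acc = sum(accs) / len(accs)

print(mean_acc)  # agrees with "all" -> "acc" (0.38580685542430376) to float precision
```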
38,732
[ [ -0.03082275390625, -0.047119140625, 0.01180267333984375, 0.019775390625, -0.014434814453125, 0.00434112548828125, -0.027130126953125, -0.01081085205078125, 0.030792236328125, 0.041046142578125, -0.05767822265625, -0.0692138671875, -0.048553466796875, 0.00829...
yangwang825/20newsgroups
2023-10-08T20:39:23.000Z
[ "region:us" ]
yangwang825
null
null
0
0
2023-10-08T20:36:28
Entry not found
15
[ [ -0.0213775634765625, -0.01494598388671875, 0.057159423828125, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052520751953125, 0.005077362060546875, 0.051361083984375, 0.0170135498046875, -0.05206298828125, -0.01494598388671875, -0.06036376953125, 0.03...
open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9
2023-10-28T13:29:27.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T20:45:22
--- pretty_name: Evaluation run of uukuguy/speechless-codellama-34b-v1.9 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [uukuguy/speechless-codellama-34b-v1.9](https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T13:29:15.296218](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9/blob/main/results_2023-10-28T13-29-15.296218.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.29771392617449666,\n\ \ \"em_stderr\": 0.004682699129958643,\n \"f1\": 0.3473626258389263,\n\ \ \"f1_stderr\": 0.004601090689469596,\n \"acc\": 0.4917554915020767,\n\ \ \"acc_stderr\": 0.012144352555904984\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.29771392617449666,\n \"em_stderr\": 0.004682699129958643,\n\ \ \"f1\": 0.3473626258389263,\n \"f1_stderr\": 0.004601090689469596\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24791508718726307,\n \ \ \"acc_stderr\": 0.01189398021482617\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\ \ }\n}\n```" repo_url: https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-44-59.061253.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T13_29_15.296218 path: - '**/details_harness|drop|3_2023-10-28T13-29-15.296218.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T13-29-15.296218.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T13_29_15.296218 path: - '**/details_harness|gsm8k|5_2023-10-28T13-29-15.296218.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T13-29-15.296218.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hellaswag|10_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_44_59.061253 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-44-59.061253.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-44-59.061253.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T13_29_15.296218 path: - '**/details_harness|winogrande|5_2023-10-28T13-29-15.296218.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T13-29-15.296218.parquet' - config_name: results data_files: - split: 2023_10_08T20_44_59.061253 path: - results_2023-10-08T20-44-59.061253.parquet - split: 2023_10_28T13_29_15.296218 path: - results_2023-10-28T13-29-15.296218.parquet - split: latest path: - results_2023-10-28T13-29-15.296218.parquet --- # Dataset Card for Evaluation run of uukuguy/speechless-codellama-34b-v1.9 ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-34b-v1.9](https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9", "harness_winogrande_5", split="latest") ``` ## Latest results These are the [latest results from run 2023-10-28T13:29:15.296218](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9/blob/main/results_2023-10-28T13-29-15.296218.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.29771392617449666, "em_stderr": 0.004682699129958643, "f1": 0.3473626258389263, "f1_stderr": 0.004601090689469596, "acc": 0.4917554915020767, "acc_stderr": 0.012144352555904984 }, "harness|drop|3": { "em": 0.29771392617449666, "em_stderr": 0.004682699129958643, "f1": 0.3473626258389263, "f1_stderr": 0.004601090689469596 }, "harness|gsm8k|5": { "acc": 0.24791508718726307, "acc_stderr": 0.01189398021482617 }, "harness|winogrande|5": { "acc": 0.7355958958168903, "acc_stderr": 0.012394724896983799 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
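As a quick sanity check on the "Latest results" JSON above, the `acc` reported in the `"all"` block is the unweighted mean of the `gsm8k` and `winogrande` accuracies; note the averaging rule here is inferred from the numbers themselves, not taken from leaderboard documentation:

```python
# Values copied from the "Latest results" JSON above.
gsm8k_acc = 0.24791508718726307        # harness|gsm8k|5
winogrande_acc = 0.7355958958168903    # harness|winogrande|5
reported_all_acc = 0.4917554915020767  # "all" block

# The aggregate appears to be the unweighted mean over the acc-style tasks.
computed = (gsm8k_acc + winogrande_acc) / 2
assert abs(computed - reported_all_acc) < 1e-12
print(computed)
```

The `em`/`f1` values in `"all"` likewise match the single `harness|drop|3` task directly, since it is the only task reporting those metrics.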
38,750
open-llm-leaderboard/details_openbmb__UltraRM-13b
2023-10-24T08:14:09.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T20:46:12
--- pretty_name: Evaluation run of openbmb/UltraRM-13b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [openbmb/UltraRM-13b](https://huggingface.co/openbmb/UltraRM-13b) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"latest\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraRM-13b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T08:13:56.124311](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraRM-13b/blob/main/results_2023-10-24T08-13-56.124311.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\ em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\ acc\": 0.24664561957379638,\n \"acc_stderr\": 0.0070256103461651745\n \ \ },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\ \ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\ : {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.49329123914759276,\n \"acc_stderr\": 0.014051220692330349\n\ \ }\n}\n```" repo_url: https://huggingface.co/openbmb/UltraRM-13b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-45-47.827028.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T08_13_56.124311 path: - '**/details_harness|drop|3_2023-10-24T08-13-56.124311.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T08-13-56.124311.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T08_13_56.124311 path: - '**/details_harness|gsm8k|5_2023-10-24T08-13-56.124311.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T08-13-56.124311.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hellaswag|10_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-45-47.827028.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_45_47.827028 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-45-47.827028.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-45-47.827028.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T08_13_56.124311 path: - '**/details_harness|winogrande|5_2023-10-24T08-13-56.124311.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T08-13-56.124311.parquet' - config_name: results data_files: - split: 2023_10_08T20_45_47.827028 path: - results_2023-10-08T20-45-47.827028.parquet - split: 2023_10_24T08_13_56.124311 path: - results_2023-10-24T08-13-56.124311.parquet - split: latest path: - results_2023-10-24T08-13-56.124311.parquet --- # Dataset Card for Evaluation run of openbmb/UltraRM-13b ## Dataset Description - **Homepage:** 
- **Repository:** https://huggingface.co/openbmb/UltraRM-13b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [openbmb/UltraRM-13b](https://huggingface.co/openbmb/UltraRM-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can, for instance, do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_openbmb__UltraRM-13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T08:13:56.124311](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraRM-13b/blob/main/results_2023-10-24T08-13-56.124311.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0, "acc": 0.24664561957379638, "acc_stderr": 0.0070256103461651745 }, "harness|drop|3": { "em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.49329123914759276, "acc_stderr": 0.014051220692330349 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
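Since each run is stored under a split named after its timestamp and the "latest" split always points at the newest one, the most recent run can also be selected programmatically by sorting the split names. A minimal sketch, using the two run timestamps listed in this card:

```python
# Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff, so plain
# lexicographic order coincides with chronological order.
runs = ["2023_10_08T20_45_47.827028", "2023_10_24T08_13_56.124311"]

latest_run = max(runs)
print(latest_run)  # -> 2023_10_24T08_13_56.124311

# Loading that split is then equivalent to passing split="latest":
# data = load_dataset("open-llm-leaderboard/details_openbmb__UltraRM-13b",
#                     "harness_winogrande_5", split=latest_run)
```

This avoids hard-coding a timestamp when new evaluation runs are appended to the repository.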
38,203
[ [ -0.03375244140625, -0.04779052734375, 0.0150146484375, 0.01412200927734375, -0.01432037353515625, 0.00782012939453125, -0.029266357421875, -0.00717926025390625, 0.0211334228515625, 0.042022705078125, -0.050689697265625, -0.07464599609375, -0.038970947265625, ...
open-llm-leaderboard/details_itsliupeng__llama2_7b_code
2023-10-26T11:17:41.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T20:46:51
--- pretty_name: Evaluation run of itsliupeng/llama2_7b_code dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [itsliupeng/llama2_7b_code](https://huggingface.co/itsliupeng/llama2_7b_code)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can, for instance, do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_code\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T11:17:28.829100](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_code/blob/main/results_2023-10-26T11-17-28.829100.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.0009437919463087249,\n\ \        \"em_stderr\": 0.00031446531194130476,\n        \"f1\": 0.05393036912751694,\n\ \        \"f1_stderr\": 0.0012935627430820335,\n        \"acc\": 0.3980985212183299,\n\ \        \"acc_stderr\": 0.01010319096153194\n    },\n    \"harness|drop|3\": {\n\ \        \"em\": 0.0009437919463087249,\n        \"em_stderr\": 0.00031446531194130476,\n\ \        \"f1\": 0.05393036912751694,\n        \"f1_stderr\": 0.0012935627430820335\n\ \    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.08112206216830932,\n \ \       \"acc_stderr\": 0.007520395797922653\n    },\n    \"harness|winogrande|5\"\ : {\n        \"acc\": 0.7150749802683505,\n        \"acc_stderr\": 0.012685986125141227\n\ \    }\n}\n```" repo_url: https://huggingface.co/itsliupeng/llama2_7b_code leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|arc:challenge|25_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T20-46-27.226805.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T11_17_28.829100 path: - '**/details_harness|drop|3_2023-10-26T11-17-28.829100.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T11-17-28.829100.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T11_17_28.829100 path: - '**/details_harness|gsm8k|5_2023-10-26T11-17-28.829100.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T11-17-28.829100.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hellaswag|10_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-46-27.226805.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T20_46_27.226805 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-46-27.226805.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T20-46-27.226805.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T11_17_28.829100 path: - '**/details_harness|winogrande|5_2023-10-26T11-17-28.829100.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T11-17-28.829100.parquet' - config_name: results data_files: - split: 2023_10_08T20_46_27.226805 path: - results_2023-10-08T20-46-27.226805.parquet - split: 2023_10_26T11_17_28.829100 path: - results_2023-10-26T11-17-28.829100.parquet - split: latest path: - results_2023-10-26T11-17-28.829100.parquet --- # Dataset Card for Evaluation run of itsliupeng/llama2_7b_code ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/itsliupeng/llama2_7b_code - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_code](https://huggingface.co/itsliupeng/llama2_7b_code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_code", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T11:17:28.829100](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_code/blob/main/results_2023-10-26T11-17-28.829100.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0009437919463087249, "em_stderr": 0.00031446531194130476, "f1": 0.05393036912751694, "f1_stderr": 0.0012935627430820335, "acc": 0.3980985212183299, "acc_stderr": 0.01010319096153194 }, "harness|drop|3": { "em": 0.0009437919463087249, "em_stderr": 0.00031446531194130476, "f1": 0.05393036912751694, "f1_stderr": 0.0012935627430820335 }, "harness|gsm8k|5": { "acc": 0.08112206216830932, "acc_stderr": 0.007520395797922653 }, "harness|winogrande|5": { "acc": 0.7150749802683505, "acc_stderr": 0.012685986125141227 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
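The details repository id loaded above follows the naming pattern visible throughout this card: `details_<org>__<model>` under the `open-llm-leaderboard` organization. A minimal sketch of deriving it from a model id (the helper name `details_repo` is illustrative, not part of any documented API; the pattern is inferred from the examples in this card):

```python
def details_repo(model_id: str) -> str:
    # Inferred convention: "/" in the model id becomes "__",
    # prefixed with "details_" under the leaderboard org.
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo("itsliupeng/llama2_7b_code"))
# → open-llm-leaderboard/details_itsliupeng__llama2_7b_code
```

The resulting string can then be passed as the first argument to `datasets.load_dataset`, as in the example above.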
38,630
[ [ -0.02783203125, -0.04095458984375, 0.017547607421875, 0.024993896484375, -0.015625, 0.01093292236328125, -0.02862548828125, -0.01334381103515625, 0.030731201171875, 0.042694091796875, -0.047821044921875, -0.0657958984375, -0.051605224609375, 0.01437377929687...
open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted
2023-10-29T11:23:43.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T21:13:29
--- pretty_name: Evaluation run of Undi95/MLewd-ReMM-L2-Chat-20B-Inverted dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Undi95/MLewd-ReMM-L2-Chat-20B-Inverted](https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-29T11:23:30.940403](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted/blob/main/results_2023-10-29T11-23-30.940403.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You can find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.04079278523489933,\n\ \        \"em_stderr\": 0.0020257579367794474,\n        \"f1\": 0.12161703020134187,\n\ \        \"f1_stderr\": 0.002493984929248759,\n        \"acc\": 0.4235474125060661,\n\ \        \"acc_stderr\": 0.009995123061460923\n    },\n    \"harness|drop|3\": {\n\ \        \"em\": 0.04079278523489933,\n        \"em_stderr\": 0.0020257579367794474,\n\ \        \"f1\": 0.12161703020134187,\n        \"f1_stderr\": 0.002493984929248759\n\ \    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.09097801364670205,\n        \ \ \"acc_stderr\": 0.007921322844013656\n    },\n    \"harness|winogrande|5\"\ : {\n        \"acc\": 0.7561168113654302,\n        \"acc_stderr\": 0.01206892327890819\n\ \    }\n}\n```" repo_url: https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|arc:challenge|25_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T21-13-04.392733.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_29T11_23_30.940403 path: - '**/details_harness|drop|3_2023-10-29T11-23-30.940403.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-29T11-23-30.940403.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_29T11_23_30.940403 path: - '**/details_harness|gsm8k|5_2023-10-29T11-23-30.940403.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-29T11-23-30.940403.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hellaswag|10_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-13-04.392733.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T21_13_04.392733 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T21-13-04.392733.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T21-13-04.392733.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_29T11_23_30.940403 path: - '**/details_harness|winogrande|5_2023-10-29T11-23-30.940403.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-29T11-23-30.940403.parquet' - config_name: results data_files: - split: 2023_10_08T21_13_04.392733 path: - results_2023-10-08T21-13-04.392733.parquet - split: 2023_10_29T11_23_30.940403 path: - results_2023-10-29T11-23-30.940403.parquet - split: latest path: - results_2023-10-29T11-23-30.940403.parquet --- # Dataset Card for Evaluation run of Undi95/MLewd-ReMM-L2-Chat-20B-Inverted ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Undi95/MLewd-ReMM-L2-Chat-20B-Inverted](https://huggingface.co/Undi95/MLewd-ReMM-L2-Chat-20B-Inverted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-29T11:23:30.940403](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-ReMM-L2-Chat-20B-Inverted/blob/main/results_2023-10-29T11-23-30.940403.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.04079278523489933, "em_stderr": 0.0020257579367794474, "f1": 0.12161703020134187, "f1_stderr": 0.002493984929248759, "acc": 0.4235474125060661, "acc_stderr": 0.009995123061460923 }, "harness|drop|3": { "em": 0.04079278523489933, "em_stderr": 0.0020257579367794474, "f1": 0.12161703020134187, "f1_stderr": 0.002493984929248759 }, "harness|gsm8k|5": { "acc": 0.09097801364670205, "acc_stderr": 0.007921322844013656 }, "harness|winogrande|5": { "acc": 0.7561168113654302, "acc_stderr": 0.01206892327890819 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
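As a quick illustration of working with the aggregated results JSON above, the sketch below flattens it into per-task rows using only plain Python. The dict literal is abridged from the "latest results" shown (point metrics only, stderr fields omitted); in practice you would `json.load()` the results file from the repo instead of hard-coding it:

```python
# Flatten the nested leaderboard results into (task, metric, value) rows.
# The dict is abridged from the "latest results" JSON shown above;
# stderr fields are omitted for brevity.
results = {
    "all": {"em": 0.04079278523489933, "f1": 0.12161703020134187,
            "acc": 0.4235474125060661},
    "harness|drop|3": {"em": 0.04079278523489933, "f1": 0.12161703020134187},
    "harness|gsm8k|5": {"acc": 0.09097801364670205},
    "harness|winogrande|5": {"acc": 0.7561168113654302},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    if task != "all"  # "all" holds the aggregates, not a single task
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task:>22}  {metric:<3}  {value:.4f}")
```

The same pattern applies to the other `results_*.json` files in these repos, since the records shown here all share the `{task: {metric: value}}` shape.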
38,770
[ [ -0.027801513671875, -0.0550537109375, 0.009002685546875, 0.0180206298828125, -0.01332855224609375, 0.0127410888671875, -0.0294189453125, -0.0153045654296875, 0.035919189453125, 0.0460205078125, -0.060028076171875, -0.0662841796875, -0.047149658203125, 0.0103...
open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b
2023-10-24T15:12:13.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T21:18:38
--- pretty_name: Evaluation run of pankajmathur/Lima_Unchained_70b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [pankajmathur/Lima_Unchained_70b](https://huggingface.co/pankajmathur/Lima_Unchained_70b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of\ \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-24T15:12:00.885313](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b/blob/main/results_2023-10-24T15-12-00.885313.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08095637583892618,\n\ \ \"em_stderr\": 0.0027934007378494835,\n \"f1\": 0.14366401006711405,\n\ \ \"f1_stderr\": 0.0029514013565745323,\n \"acc\": 0.591927346839615,\n\ \ \"acc_stderr\": 0.011752297176210316\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.08095637583892618,\n \"em_stderr\": 0.0027934007378494835,\n\ \ \"f1\": 0.14366401006711405,\n \"f1_stderr\": 0.0029514013565745323\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34723275208491283,\n \ \ \"acc_stderr\": 0.01311389838214687\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.01039069597027376\n\ \ }\n}\n```" repo_url: https://huggingface.co/pankajmathur/Lima_Unchained_70b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|arc:challenge|25_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T21-18-19.268295.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_24T15_12_00.885313 path: - '**/details_harness|drop|3_2023-10-24T15-12-00.885313.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-24T15-12-00.885313.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_24T15_12_00.885313 path: - '**/details_harness|gsm8k|5_2023-10-24T15-12-00.885313.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-24T15-12-00.885313.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hellaswag|10_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T21_18_19.268295 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-18-19.268295.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T21_18_19.268295 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T21-18-19.268295.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T21-18-19.268295.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_24T15_12_00.885313 path: - '**/details_harness|winogrande|5_2023-10-24T15-12-00.885313.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-24T15-12-00.885313.parquet' - config_name: results data_files: - split: 2023_10_08T21_18_19.268295 path: - results_2023-10-08T21-18-19.268295.parquet - split: 2023_10_24T15_12_00.885313 path: - results_2023-10-24T15-12-00.885313.parquet - split: latest path: - results_2023-10-24T15-12-00.885313.parquet --- # Dataset Card for Evaluation run of pankajmathur/Lima_Unchained_70b ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/pankajmathur/Lima_Unchained_70b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [pankajmathur/Lima_Unchained_70b](https://huggingface.co/pankajmathur/Lima_Unchained_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-24T15:12:00.885313](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__Lima_Unchained_70b/blob/main/results_2023-10-24T15-12-00.885313.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.08095637583892618, "em_stderr": 0.0027934007378494835, "f1": 0.14366401006711405, "f1_stderr": 0.0029514013565745323, "acc": 0.591927346839615, "acc_stderr": 0.011752297176210316 }, "harness|drop|3": { "em": 0.08095637583892618, "em_stderr": 0.0027934007378494835, "f1": 0.14366401006711405, "f1_stderr": 0.0029514013565745323 }, "harness|gsm8k|5": { "acc": 0.34723275208491283, "acc_stderr": 0.01311389838214687 }, "harness|winogrande|5": { "acc": 0.8366219415943172, "acc_stderr": 0.01039069597027376 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
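The aggregated "acc" in the "all" block above matches the plain mean of the two per-task accuracies (gsm8k and winogrande). A minimal sketch of that check, assuming the leaderboard aggregates accuracy as an equal-weight mean over tasks:

```python
# Recompute the aggregated "acc" from the "all" block as the equal-weight
# mean of the per-task accuracies reported for this run.
# (Assumption: the leaderboard aggregates accuracy as a plain mean.)
per_task_acc = {
    "harness|gsm8k|5": 0.34723275208491283,
    "harness|winogrande|5": 0.8366219415943172,
}

aggregated_acc = sum(per_task_acc.values()) / len(per_task_acc)

# Compare against the "acc" value reported in the "all" block.
assert abs(aggregated_acc - 0.591927346839615) < 1e-12
```

The same relationship holds for the "em" and "f1" aggregates, which here come from a single task (drop) and therefore equal that task's scores directly.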
38,686
[ [ -0.032073974609375, -0.050750732421875, 0.0108489990234375, 0.01922607421875, -0.0216522216796875, 0.01087188720703125, -0.029205322265625, -0.0116424560546875, 0.031036376953125, 0.047576904296875, -0.048309326171875, -0.06866455078125, -0.051971435546875, ...
Monkaro/1024x1024-Woman
2023-10-08T21:59:06.000Z
[ "region:us" ]
Monkaro
null
null
0
0
2023-10-08T21:46:30
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0
2023-10-23T15:36:00.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T21:56:02
--- pretty_name: Evaluation run of uukuguy/speechless-codellama-34b-v2.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [uukuguy/speechless-codellama-34b-v2.0](https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T15:35:47.826162](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0/blob/main/results_2023-10-23T15-35-47.826162.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3704907718120805,\n\ \ \"em_stderr\": 0.004945718565106882,\n \"f1\": 0.4170574664429539,\n\ \ \"f1_stderr\": 0.004815998685057963,\n \"acc\": 0.42579643160821773,\n\ \ \"acc_stderr\": 0.010607605194213141\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.3704907718120805,\n \"em_stderr\": 0.004945718565106882,\n\ \ \"f1\": 0.4170574664429539,\n \"f1_stderr\": 0.004815998685057963\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \ \ \"acc_stderr\": 0.008820485491442485\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\ \ }\n}\n```" repo_url: https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|arc:challenge|25_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T21-55-38.209151.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T15_35_47.826162 path: - '**/details_harness|drop|3_2023-10-23T15-35-47.826162.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T15-35-47.826162.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T15_35_47.826162 path: - '**/details_harness|gsm8k|5_2023-10-23T15-35-47.826162.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T15-35-47.826162.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hellaswag|10_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T21-55-38.209151.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_08T21_55_38.209151 path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T21-55-38.209151.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-08T21-55-38.209151.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T15_35_47.826162 path: - '**/details_harness|winogrande|5_2023-10-23T15-35-47.826162.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T15-35-47.826162.parquet' - config_name: results data_files: - split: 2023_10_08T21_55_38.209151 path: - results_2023-10-08T21-55-38.209151.parquet - split: 2023_10_23T15_35_47.826162 path: - results_2023-10-23T15-35-47.826162.parquet - split: latest path: - results_2023-10-23T15-35-47.826162.parquet --- # Dataset Card for Evaluation run of uukuguy/speechless-codellama-34b-v2.0 ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-34b-v2.0](https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T15:35:47.826162](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v2.0/blob/main/results_2023-10-23T15-35-47.826162.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.3704907718120805, "em_stderr": 0.004945718565106882, "f1": 0.4170574664429539, "f1_stderr": 0.004815998685057963, "acc": 0.42579643160821773, "acc_stderr": 0.010607605194213141 }, "harness|drop|3": { "em": 0.3704907718120805, "em_stderr": 0.004945718565106882, "f1": 0.4170574664429539, "f1_stderr": 0.004815998685057963 }, "harness|gsm8k|5": { "acc": 0.11599696739954511, "acc_stderr": 0.008820485491442485 }, "harness|winogrande|5": { "acc": 0.7355958958168903, "acc_stderr": 0.012394724896983799 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
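The flat JSON above can be turned into tidy per-task rows with a few lines of Python. This is a minimal sketch: the `results` dict simply restates the metrics printed above, and the `task_key.split("|")` step assumes the `harness|<task>|<shots>` key naming used throughout this card.

```python
# Metrics copied from the results JSON above (stderr fields omitted).
results = {
    "harness|drop|3": {"em": 0.3704907718120805, "f1": 0.4170574664429539},
    "harness|gsm8k|5": {"acc": 0.11599696739954511},
    "harness|winogrande|5": {"acc": 0.7355958958168903},
}

rows = []
for task_key, metrics in results.items():
    # Keys follow the "harness|<task>|<shots>" convention.
    _, task, shots = task_key.split("|")
    for metric, value in metrics.items():
        rows.append((task, int(shots), metric, round(value, 4)))

for row in rows:
    print(row)
```

The same reshaping applies to any of the timestamped `results_*.json` files, since they all share the `harness|<task>|<shots>` key layout.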
38,750
[ [ -0.025299072265625, -0.0438232421875, 0.0165252685546875, 0.0235443115234375, -0.012664794921875, 0.0118865966796875, -0.030609130859375, -0.01369476318359375, 0.02886962890625, 0.0426025390625, -0.044189453125, -0.06842041015625, -0.042083740234375, 0.00703...
munyamakosa/creditscoring
2023-10-08T22:22:44.000Z
[ "region:us" ]
munyamakosa
null
null
0
0
2023-10-08T22:22:44
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
Joisas/s
2023-10-08T22:49:53.000Z
[ "region:us" ]
Joisas
null
null
0
0
2023-10-08T22:24:52
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
open-llm-leaderboard/details_Aeala__Alpaca-elina-65b
2023-10-08T22:34:40.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T22:34:32
--- pretty_name: Evaluation run of Aeala/Alpaca-elina-65b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Aeala/Alpaca-elina-65b](https://huggingface.co/Aeala/Alpaca-elina-65b) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 3 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__Alpaca-elina-65b\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-08T22:34:28.379829](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__Alpaca-elina-65b/blob/main/results_2023-10-08T22-34-28.379829.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.27736996644295303,\n\ \ \"em_stderr\": 0.004584873651869028,\n \"f1\": 0.33694211409395997,\n\ \ \"f1_stderr\": 0.004497646539610947,\n \"acc\": 0.5520523608267965,\n\ \ \"acc_stderr\": 0.011722735218747584\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.27736996644295303,\n \"em_stderr\": 0.004584873651869028,\n\ \ \"f1\": 0.33694211409395997,\n \"f1_stderr\": 0.004497646539610947\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29037149355572406,\n \ \ \"acc_stderr\": 0.012503592481818955\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676211\n\ \ }\n}\n```" repo_url: https://huggingface.co/Aeala/Alpaca-elina-65b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_drop_3 data_files: - split: 2023_10_08T22_34_28.379829 path: - '**/details_harness|drop|3_2023-10-08T22-34-28.379829.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-08T22-34-28.379829.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_08T22_34_28.379829 path: - '**/details_harness|gsm8k|5_2023-10-08T22-34-28.379829.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-08T22-34-28.379829.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_08T22_34_28.379829 path: - '**/details_harness|winogrande|5_2023-10-08T22-34-28.379829.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-08T22-34-28.379829.parquet' - config_name: results data_files: - split: 2023_10_08T22_34_28.379829 path: - results_2023-10-08T22-34-28.379829.parquet - split: latest path: - results_2023-10-08T22-34-28.379829.parquet --- # Dataset Card for Evaluation run of Aeala/Alpaca-elina-65b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Aeala/Alpaca-elina-65b - 
**Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Aeala/Alpaca-elina-65b](https://huggingface.co/Aeala/Alpaca-elina-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Aeala__Alpaca-elina-65b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-08T22:34:28.379829](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__Alpaca-elina-65b/blob/main/results_2023-10-08T22-34-28.379829.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.27736996644295303, "em_stderr": 0.004584873651869028, "f1": 0.33694211409395997, "f1_stderr": 0.004497646539610947, "acc": 0.5520523608267965, "acc_stderr": 0.011722735218747584 }, "harness|drop|3": { "em": 0.27736996644295303, "em_stderr": 0.004584873651869028, "f1": 0.33694211409395997, "f1_stderr": 0.004497646539610947 }, "harness|gsm8k|5": { "acc": 0.29037149355572406, "acc_stderr": 0.012503592481818955 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.010941877955676211 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
7,165
[ [ -0.03240966796875, -0.05767822265625, 0.011138916015625, 0.0183563232421875, -0.010345458984375, 0.0024852752685546875, -0.018310546875, -0.0192718505859375, 0.038116455078125, 0.036163330078125, -0.04541015625, -0.06787109375, -0.05291748046875, 0.017471313...
open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy
2023-10-08T23:28:24.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-08T23:27:25
--- pretty_name: Evaluation run of u-chom/preded-title-amazongoogle-abtbuy dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [u-chom/preded-title-amazongoogle-abtbuy](https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-08T23:27:01.372351](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy/blob/main/results_2023-10-08T23-27-01.372351.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38488561065128146,\n\ \ \"acc_stderr\": 0.03460083230388379,\n \"acc_norm\": 0.38889827098513907,\n\ \ \"acc_norm_stderr\": 0.034587970268505575,\n \"mc1\": 0.2692778457772338,\n\ \ \"mc1_stderr\": 0.015528566637087281,\n \"mc2\": 0.4164930056701617,\n\ \ \"mc2_stderr\": 0.013916947335276144\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.4667235494880546,\n \"acc_stderr\": 0.014578995859605808,\n\ \ \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5873332005576578,\n\ \ \"acc_stderr\": 0.00491307684443376,\n \"acc_norm\": 0.7814180442143,\n\ \ \"acc_norm_stderr\": 0.004124396294659584\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\ \ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n\ \ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.038781398887976104,\n\ \ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.038781398887976104\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.3584905660377358,\n \"acc_stderr\": 0.02951470358398177,\n\ \ \"acc_norm\": 0.3584905660377358,\n \"acc_norm_stderr\": 0.02951470358398177\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3888888888888889,\n\ \ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.3888888888888889,\n\ \ \"acc_norm_stderr\": 0.04076663253918567\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\ : 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\ \ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\ \ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\ \ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\ \ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"\ acc_norm\": 0.21164021164021163,\n 
\"acc_norm_stderr\": 0.021037331505262883\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\ \ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\ \ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.35161290322580646,\n \"acc_stderr\": 0.027162537826948458,\n \"\ acc_norm\": 0.35161290322580646,\n \"acc_norm_stderr\": 0.027162537826948458\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\ acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\ : 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.4484848484848485,\n \"acc_stderr\": 0.038835659779569286,\n\ \ \"acc_norm\": 0.4484848484848485,\n \"acc_norm_stderr\": 0.038835659779569286\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.40404040404040403,\n \"acc_stderr\": 0.03496130972056127,\n \"\ acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.03496130972056127\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.03590910952235524,\n\ \ \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.03590910952235524\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n\ \ \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \ \ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\ \ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"\ acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.43853211009174314,\n \"acc_stderr\": 0.021274713073954562,\n \"\ acc_norm\": 0.43853211009174314,\n \"acc_norm_stderr\": 0.021274713073954562\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824685,\n \"\ acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824685\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\ acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.3881856540084388,\n \"acc_stderr\": 0.0317229500433233,\n \ \ \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.0317229500433233\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\ \ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\ \ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\ \ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.512396694214876,\n \"acc_stderr\": 0.045629515481807666,\n \"\ acc_norm\": 0.512396694214876,\n \"acc_norm_stderr\": 0.045629515481807666\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.038566721635489125,\n\ \ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.038566721635489125\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\ \ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\ \ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.048657775704107675,\n\ \ \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.048657775704107675\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\ \ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\ \ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\ : {\n \"acc\": 0.5491698595146871,\n \"acc_stderr\": 0.017793297572699034,\n\ \ \"acc_norm\": 0.5491698595146871,\n \"acc_norm_stderr\": 0.017793297572699034\n\ \ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.42196531791907516,\n\ \ \"acc_stderr\": 0.02658923114217426,\n \"acc_norm\": 0.42196531791907516,\n\ \ \"acc_norm_stderr\": 0.02658923114217426\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\ : {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n\ \ \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n\ \ },\n 
\"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.40522875816993464,\n\ \ \"acc_stderr\": 0.02811092849280907,\n \"acc_norm\": 0.40522875816993464,\n\ \ \"acc_norm_stderr\": 0.02811092849280907\n },\n \"harness|hendrycksTest-philosophy|5\"\ : {\n \"acc\": 0.5080385852090032,\n \"acc_stderr\": 0.028394421370984538,\n\ \ \"acc_norm\": 0.5080385852090032,\n \"acc_norm_stderr\": 0.028394421370984538\n\ \ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.43209876543209874,\n\ \ \"acc_stderr\": 0.027563010971606676,\n \"acc_norm\": 0.43209876543209874,\n\ \ \"acc_norm_stderr\": 0.027563010971606676\n },\n \"harness|hendrycksTest-professional_accounting|5\"\ : {\n \"acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n\ \ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33572359843546284,\n\ \ \"acc_stderr\": 0.012061304157664607,\n \"acc_norm\": 0.33572359843546284,\n\ \ \"acc_norm_stderr\": 0.012061304157664607\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\ \ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.4035947712418301,\n \"acc_stderr\": 0.019848280168401154,\n \ \ \"acc_norm\": 0.4035947712418301,\n \"acc_norm_stderr\": 0.019848280168401154\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\ \ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\ \ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784606,\n\ \ \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784606\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\ \ 
\"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n\ \ \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n\ \ \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6140350877192983,\n\ \ \"acc_stderr\": 0.03733756969066165,\n \"acc_norm\": 0.6140350877192983,\n\ \ \"acc_norm_stderr\": 0.03733756969066165\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.2692778457772338,\n \"mc1_stderr\": 0.015528566637087281,\n\ \ \"mc2\": 0.4164930056701617,\n \"mc2_stderr\": 0.013916947335276144\n\ \ }\n}\n```" repo_url: https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_08T23_27_01.372351 path: - '**/details_harness|arc:challenge|25_2023-10-08T23-27-01.372351.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-08T23-27-01.372351.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_08T23_27_01.372351 path: - '**/details_harness|hellaswag|10_2023-10-08T23-27-01.372351.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-08T23-27-01.372351.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_08T23_27_01.372351 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet' - 
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T23-27-01.372351.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-08T23-27-01.372351.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-08T23-27-01.372351.parquet'
- config_name: results
  data_files:
  - split: 2023_10_08T23_27_01.372351
    path:
    - results_2023-10-08T23-27-01.372351.parquet
  - split: latest
    path:
    - results_2023-10-08T23-27-01.372351.parquet
---

# Dataset Card for Evaluation run of u-chom/preded-title-amazongoogle-abtbuy

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [u-chom/preded-title-amazongoogle-abtbuy](https://huggingface.co/u-chom/preded-title-amazongoogle-abtbuy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
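For orientation, the splits in this dataset are named after the run timestamp, with the `-` and `:` of the ISO timestamp replaced by `_`. A minimal sketch of that mapping (the helper function is our own illustration, not part of the card's tooling):

```python
def timestamp_to_split(ts: str) -> str:
    """Map an ISO-like run timestamp to the split name used in this dataset.

    The split name keeps the digits and the 'T' separator but replaces
    '-' and ':' with '_' (e.g. '2023-10-08T23:27:01.372351' becomes
    '2023_10_08T23_27_01.372351').
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-08T23:27:01.372351"))
# → 2023_10_08T23_27_01.372351
```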
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy",
                    "harness_truthfulqa_mc_0",
                    split="train")
```

## Latest results

These are the [latest results from run 2023-10-08T23:27:01.372351](https://huggingface.co/datasets/open-llm-leaderboard/details_u-chom__preded-title-amazongoogle-abtbuy/blob/main/results_2023-10-08T23-27-01.372351.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.38488561065128146,
        "acc_stderr": 0.03460083230388379,
        "acc_norm": 0.38889827098513907,
        "acc_norm_stderr": 0.034587970268505575,
        "mc1": 0.2692778457772338,
        "mc1_stderr": 0.015528566637087281,
        "mc2": 0.4164930056701617,
        "mc2_stderr": 0.013916947335276144
    },
    "harness|arc:challenge|25": {
        "acc": 0.4667235494880546,
        "acc_stderr": 0.014578995859605808,
        "acc_norm": 0.5093856655290102,
        "acc_norm_stderr": 0.014608816322065
    },
    "harness|hellaswag|10": {
        "acc": 0.5873332005576578,
        "acc_stderr": 0.00491307684443376,
        "acc_norm": 0.7814180442143,
        "acc_norm_stderr": 0.004124396294659584
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.32,
        "acc_stderr": 0.046882617226215034,
        "acc_norm": 0.32,
        "acc_norm_stderr": 0.046882617226215034
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.4222222222222222,
        "acc_stderr": 0.042667634040995814,
        "acc_norm": 0.4222222222222222,
        "acc_norm_stderr": 0.042667634040995814
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.34868421052631576,
        "acc_stderr": 0.038781398887976104,
        "acc_norm": 0.34868421052631576,
        "acc_norm_stderr": 0.038781398887976104
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.43,
        "acc_stderr": 0.049756985195624284,
        "acc_norm": 0.43,
        "acc_norm_stderr": 0.049756985195624284
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.3584905660377358,
        "acc_stderr": 0.02951470358398177,
        "acc_norm": 0.3584905660377358,
        "acc_norm_stderr": 0.02951470358398177
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.3888888888888889,
        "acc_stderr": 0.04076663253918567,
        "acc_norm": 0.3888888888888889,
        "acc_norm_stderr": 0.04076663253918567
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.24,
        "acc_stderr": 0.042923469599092816,
        "acc_norm": 0.24,
        "acc_norm_stderr": 0.042923469599092816
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.36,
        "acc_stderr": 0.04824181513244218,
        "acc_norm": 0.36,
        "acc_norm_stderr": 0.04824181513244218
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.27,
        "acc_stderr": 0.044619604333847394,
        "acc_norm": 0.27,
        "acc_norm_stderr": 0.044619604333847394
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.3236994219653179,
        "acc_stderr": 0.0356760379963917,
        "acc_norm": 0.3236994219653179,
        "acc_norm_stderr": 0.0356760379963917
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.20588235294117646,
        "acc_stderr": 0.04023382273617747,
        "acc_norm": 0.20588235294117646,
        "acc_norm_stderr": 0.04023382273617747
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.53,
        "acc_stderr": 0.05016135580465919,
        "acc_norm": 0.53,
        "acc_norm_stderr": 0.05016135580465919
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.3829787234042553,
        "acc_stderr": 0.03177821250236922,
        "acc_norm": 0.3829787234042553,
        "acc_norm_stderr": 0.03177821250236922
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.24561403508771928,
        "acc_stderr": 0.040493392977481425,
        "acc_norm": 0.24561403508771928,
        "acc_norm_stderr": 0.040493392977481425
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.3586206896551724,
        "acc_stderr": 0.03996629574876719,
        "acc_norm": 0.3586206896551724,
        "acc_norm_stderr": 0.03996629574876719
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.21164021164021163,
        "acc_stderr": 0.021037331505262883,
        "acc_norm": 0.21164021164021163,
        "acc_norm_stderr": 0.021037331505262883
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.31746031746031744,
        "acc_stderr": 0.04163453031302859,
        "acc_norm": 0.31746031746031744,
        "acc_norm_stderr": 0.04163453031302859
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.29,
        "acc_stderr": 0.045604802157206845,
        "acc_norm": 0.29,
        "acc_norm_stderr": 0.045604802157206845
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.35161290322580646,
        "acc_stderr": 0.027162537826948458,
        "acc_norm": 0.35161290322580646,
        "acc_norm_stderr": 0.027162537826948458
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.2413793103448276,
        "acc_stderr": 0.030108330718011625,
        "acc_norm": 0.2413793103448276,
        "acc_norm_stderr": 0.030108330718011625
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.36,
        "acc_stderr": 0.048241815132442176,
        "acc_norm": 0.36,
        "acc_norm_stderr": 0.048241815132442176
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.4484848484848485,
        "acc_stderr": 0.038835659779569286,
        "acc_norm": 0.4484848484848485,
        "acc_norm_stderr": 0.038835659779569286
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.40404040404040403,
        "acc_stderr": 0.03496130972056127,
        "acc_norm": 0.40404040404040403,
        "acc_norm_stderr": 0.03496130972056127
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.5492227979274611,
        "acc_stderr": 0.03590910952235524,
        "acc_norm": 0.5492227979274611,
        "acc_norm_stderr": 0.03590910952235524
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.32564102564102565,
        "acc_stderr": 0.02375966576741229,
        "acc_norm": 0.32564102564102565,
        "acc_norm_stderr": 0.02375966576741229
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.21851851851851853,
        "acc_stderr": 0.02519575225182379,
        "acc_norm": 0.21851851851851853,
        "acc_norm_stderr": 0.02519575225182379
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.3445378151260504,
        "acc_stderr": 0.030868682604121626,
        "acc_norm": 0.3445378151260504,
        "acc_norm_stderr": 0.030868682604121626
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.2251655629139073,
        "acc_stderr": 0.03410435282008937,
        "acc_norm": 0.2251655629139073,
        "acc_norm_stderr": 0.03410435282008937
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.43853211009174314,
        "acc_stderr": 0.021274713073954562,
        "acc_norm": 0.43853211009174314,
        "acc_norm_stderr": 0.021274713073954562
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.1712962962962963,
        "acc_stderr": 0.025695341643824685,
        "acc_norm": 0.1712962962962963,
        "acc_norm_stderr": 0.025695341643824685
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.4019607843137255,
        "acc_stderr": 0.034411900234824655,
        "acc_norm": 0.4019607843137255,
        "acc_norm_stderr": 0.034411900234824655
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.3881856540084388,
        "acc_stderr": 0.0317229500433233,
        "acc_norm": 0.3881856540084388,
        "acc_norm_stderr": 0.0317229500433233
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.4798206278026906,
        "acc_stderr": 0.033530461674123,
        "acc_norm": 0.4798206278026906,
        "acc_norm_stderr": 0.033530461674123
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.4198473282442748,
        "acc_stderr": 0.04328577215262972,
        "acc_norm": 0.4198473282442748,
        "acc_norm_stderr": 0.04328577215262972
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.512396694214876,
        "acc_stderr": 0.045629515481807666,
        "acc_norm": 0.512396694214876,
        "acc_norm_stderr": 0.045629515481807666
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.4444444444444444,
        "acc_stderr": 0.04803752235190193,
        "acc_norm": 0.4444444444444444,
        "acc_norm_stderr": 0.04803752235190193
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.4049079754601227,
        "acc_stderr": 0.038566721635489125,
        "acc_norm": 0.4049079754601227,
        "acc_norm_stderr": 0.038566721635489125
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.3482142857142857,
        "acc_stderr": 0.04521829902833586,
        "acc_norm": 0.3482142857142857,
        "acc_norm_stderr": 0.04521829902833586
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.4077669902912621,
        "acc_stderr": 0.048657775704107675,
        "acc_norm": 0.4077669902912621,
        "acc_norm_stderr": 0.048657775704107675
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.6367521367521367,
        "acc_stderr": 0.03150712523091264,
        "acc_norm": 0.6367521367521367,
        "acc_norm_stderr": 0.03150712523091264
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.45,
        "acc_stderr": 0.05,
        "acc_norm": 0.45,
        "acc_norm_stderr": 0.05
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.5491698595146871,
        "acc_stderr": 0.017793297572699034,
        "acc_norm": 0.5491698595146871,
        "acc_norm_stderr": 0.017793297572699034
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.42196531791907516,
        "acc_stderr": 0.02658923114217426,
        "acc_norm": 0.42196531791907516,
        "acc_norm_stderr": 0.02658923114217426
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.23798882681564246,
        "acc_stderr": 0.014242630070574915,
        "acc_norm": 0.23798882681564246,
        "acc_norm_stderr": 0.014242630070574915
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.40522875816993464,
        "acc_stderr": 0.02811092849280907,
        "acc_norm": 0.40522875816993464,
        "acc_norm_stderr": 0.02811092849280907
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.5080385852090032,
        "acc_stderr": 0.028394421370984538,
        "acc_norm": 0.5080385852090032,
        "acc_norm_stderr": 0.028394421370984538
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.43209876543209874,
        "acc_stderr": 0.027563010971606676,
        "acc_norm": 0.43209876543209874,
        "acc_norm_stderr": 0.027563010971606676
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.30141843971631205,
        "acc_stderr": 0.02737412888263115,
        "acc_norm": 0.30141843971631205,
        "acc_norm_stderr": 0.02737412888263115
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.33572359843546284,
        "acc_stderr": 0.012061304157664607,
        "acc_norm": 0.33572359843546284,
        "acc_norm_stderr": 0.012061304157664607
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.24632352941176472,
        "acc_stderr": 0.02617343857052,
        "acc_norm": 0.24632352941176472,
        "acc_norm_stderr": 0.02617343857052
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.4035947712418301,
        "acc_stderr": 0.019848280168401154,
        "acc_norm": 0.4035947712418301,
        "acc_norm_stderr": 0.019848280168401154
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.4636363636363636,
        "acc_stderr": 0.047764491623961985,
        "acc_norm":
0.4636363636363636, "acc_norm_stderr": 0.047764491623961985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2653061224489796, "acc_stderr": 0.028263889943784606, "acc_norm": 0.2653061224489796, "acc_norm_stderr": 0.028263889943784606 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5572139303482587, "acc_stderr": 0.03512310964123937, "acc_norm": 0.5572139303482587, "acc_norm_stderr": 0.03512310964123937 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-virology|5": { "acc": 0.3855421686746988, "acc_stderr": 0.037891344246115496, "acc_norm": 0.3855421686746988, "acc_norm_stderr": 0.037891344246115496 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6140350877192983, "acc_stderr": 0.03733756969066165, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.03733756969066165 }, "harness|truthfulqa:mc|0": { "mc1": 0.2692778457772338, "mc1_stderr": 0.015528566637087281, "mc2": 0.4164930056701617, "mc2_stderr": 0.013916947335276144 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
65,020
[ [ -0.048370361328125, -0.0577392578125, 0.0191650390625, 0.0130157470703125, -0.01300811767578125, -0.001987457275390625, 0.0009870529174804688, -0.0148468017578125, 0.039276123046875, -0.0009331703186035156, -0.031768798828125, -0.04876708984375, -0.0296478271484...
ai4ce/OCFBench
2023-11-02T15:12:13.000Z
[ "size_categories:10K<n<100K", "language:en", "license:cc-by-nc-sa-4.0", "arxiv:2310.11239", "region:us" ]
ai4ce
null
null
0
0
2023-10-08T23:32:22
--- license: cc-by-nc-sa-4.0 language: - en pretty_name: OCFBench size_categories: - 10K<n<100K --- # Dataset Card for OCFBench [[Paper]](https://arxiv.org/abs/2310.11239) [[Code]](https://github.com/ai4ce/Occ4cast/) [[Website]](https://ai4ce.github.io/Occ4cast/) <!-- Provide a quick summary of the dataset. --> The OCFBench dataset is curated in the paper [**Occ4cast: LiDAR-based 4D Occupancy Completion and Forecasting**](https://arxiv.org/abs/2310.11239). The dataset is processed from public autonomous driving data to support the training and evaluation of the novel **occupancy completion and forecasting (OCF)** task. # Uses Please download each `.sqf` file from the individual directories and mount it on your local system for use. For larger files that are split into several parts, please run the following code to merge the parts before mounting: ``` cat output_prefix_* > merged.sqf ``` Please refer to our [GitHub repository](https://github.com/ai4ce/Occ4cast/) for dataset structure and loading details. ## Citation <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** ``` @article{Liu2023occ4cast, title={LiDAR-based 4D Occupancy Completion and Forecasting}, author={Xinhao Liu and Moonjun Gong and Qi Fang and Haoyu Xie and Yiming Li and Hang Zhao and Chen Feng}, journal={arXiv preprint arXiv:2310.11239}, year={2023} } ```
1,468
[ [ -0.054962158203125, -0.0187225341796875, 0.046844482421875, -0.015899658203125, -0.0066375732421875, -0.0173492431640625, 0.002017974853515625, -0.050628662109375, -0.0021610260009765625, 0.0282135009765625, -0.0297393798828125, -0.0526123046875, -0.014083862304...
neuroback/DataBack
2023-10-27T21:13:12.000Z
[ "arxiv:2110.14053", "region:us" ]
neuroback
null
null
1
0
2023-10-08T23:35:37
# DataBack: Dataset of SAT Formulas and Backbone Variable Phases ## What is DataBack `DataBack` is a dataset that consists of 120,286 SAT formulas (in CNF format), each labeled with the phases of its backbone variables. `DataBack` contains two distinct subsets: the pre-training set, named `DataBack-PT`, and the fine-tuning set, named `DataBack-FT`, for pre-training and fine-tuning our `NeuroBack` model, respectively. To learn more about `NeuroBack` and `DataBack`, please refer to our [`NeuroBack paper`](https://arxiv.org/pdf/2110.14053.pdf). The state-of-the-art backbone extractor, [`CadiBack`](https://github.com/arminbiere/cadiback), has been employed to extract the backbone variable phases. To learn more about `CadiBack`, please refer to the [`CadiBack paper`](https://wenxiwang.github.io/papers/cadiback.pdf). ## Directory Structure ``` |- original # Original CNF formulas and their backbone variable phases | |- cnf_pt.tar.gz # CNF formulas for pre-training | |- bb_pt.tar.gz # Backbone phases for pre-training formulas | |- cnf_ft.tar.gz # CNF formulas for fine-tuning | |- bb_ft.tar.gz # Backbone phases for fine-tuning formulas | |- dual # Dual CNF formulas and their backbone variable phases | |- d_cnf_pt.tar.gz # Dual CNF formulas for pre-training | |- d_bb_pt.tar.gz # Backbone phases for dual pre-training formulas | |- d_cnf_ft.tar.gz # Dual CNF formulas for fine-tuning | |- d_bb_ft.tar.gz # Backbone phases for dual fine-tuning formulas ``` ## File Naming Convention In the original directory, each CNF tar file (**`cnf_*.tar.gz`**) contains compressed CNF files named: **`[cnf_name].[compression_format]`**, where **`[compression_format]`** could be bz2, lzma, xz, gz, etc. Correspondingly, each backbone tar file (**`bb_*.tar.gz`**) comprises compressed backbone files named: **`[cnf_name].backbone.xz`**. It is important to note that a compressed CNF file will always share its **`[cnf_name]`** with its associated compressed backbone file. 
For dual formulas and their corresponding backbone files, the naming convention remains consistent, but with an added **`d_`** prefix. ## Format of the Extracted Backbone File The extracted backbone file (`*.backbone`) adheres to the output format of [`CadiBack`](https://github.com/arminbiere/cadiback). ## References If you use `DataBack` in your research, please kindly cite the following papers. [`NeuroBack paper`](https://arxiv.org/pdf/2110.14053.pdf): ```bib @article{wang2023neuroback, author = {Wang, Wenxi and Hu, Yang and Tiwari, Mohit and Khurshid, Sarfraz and McMillan, Kenneth L. and Miikkulainen, Risto}, title = {NeuroBack: Improving CDCL SAT Solving using Graph Neural Networks}, journal={arXiv preprint arXiv:2110.14053}, year={2021} } ``` [`CadiBack paper`](https://wenxiwang.github.io/papers/cadiback.pdf): ```bib @inproceedings{biere2023cadiback, title={CadiBack: Extracting Backbones with CaDiCaL}, author={Biere, Armin and Froleyks, Nils and Wang, Wenxi}, booktitle={26th International Conference on Theory and Applications of Satisfiability Testing (SAT 2023)}, year={2023}, organization={Schloss Dagstuhl-Leibniz-Zentrum f{\"u}r Informatik} } ``` ## Contributors Wenxi Wang (wenxiw@utexas.edu), Yang Hu (huyang@utexas.edu)
3,386
[ [ -0.02099609375, -0.0270843505859375, 0.015411376953125, 0.04168701171875, -0.0136566162109375, 0.0021915435791015625, -0.0087738037109375, -0.006816864013671875, 0.03497314453125, 0.051727294921875, -0.053070068359375, -0.05767822265625, -0.0253143310546875, ...
benxh/us-library-of-congress-subjects
2023-10-08T23:38:01.000Z
[ "region:us" ]
benxh
null
null
0
0
2023-10-08T23:36:22
Just all the subjects from the US Library of Congress, cleaned up into JSONL files (metadata missing). Grab the latest here: https://id.loc.gov/authorities/subjects.html
164
[ [ -0.0309906005859375, -0.0628662109375, 0.060577392578125, -0.0239715576171875, -0.0244293212890625, 0.042236328125, 0.0097503662109375, -0.005435943603515625, 0.03448486328125, 0.0946044921875, -0.03826904296875, -0.05535888671875, -0.0286407470703125, 0.002...
Zerenidel/DG_Simple
2023-10-09T03:16:36.000Z
[ "region:us" ]
Zerenidel
null
null
0
0
2023-10-09T00:28:16
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
Ar4ikov/celebA_spoof
2023-10-09T03:25:23.000Z
[ "region:us" ]
Ar4ikov
null
null
1
0
2023-10-09T01:02:20
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: valid path: data/valid-* - split: test path: data/test-* dataset_info: features: - name: Filepath dtype: image - name: Bbox sequence: int64 - name: Class dtype: string splits: - name: train num_bytes: 46432284811.335 num_examples: 419935 - name: valid num_bytes: 4163631829.316 num_examples: 46738 - name: test num_bytes: 32416692607.675 num_examples: 59191 download_size: 72011056582 dataset_size: 83012609248.326 --- # Dataset Card for "celebA_spoof" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
752
[ [ -0.028961181640625, -0.048614501953125, 0.005275726318359375, 0.0246124267578125, -0.00498199462890625, 0.003688812255859375, 0.01456451416015625, -0.021942138671875, 0.085693359375, 0.0394287109375, -0.043182373046875, -0.040618896484375, -0.04718017578125, ...
Monkaro/Starsmitten
2023-10-09T01:04:02.000Z
[ "region:us" ]
Monkaro
null
null
0
0
2023-10-09T01:02:37
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1
2023-10-28T23:37:48.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T01:34:51
--- pretty_name: Evaluation run of s1ghhh/medllama-2-70b-qlora-1.1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [s1ghhh/medllama-2-70b-qlora-1.1](https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-28T23:37:36.261412](https://huggingface.co/datasets/open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1/blob/main/results_2023-10-28T23-37-36.261412.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4476719798657718,\n\ \ \"em_stderr\": 0.005092348829658167,\n \"f1\": 0.49099203020134397,\n\ \ \"f1_stderr\": 0.004914477006067904,\n \"acc\": 0.5814221507886975,\n\ \ \"acc_stderr\": 0.011551816841221033\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.4476719798657718,\n \"em_stderr\": 0.005092348829658167,\n\ \ \"f1\": 0.49099203020134397,\n \"f1_stderr\": 0.004914477006067904\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3206974981046247,\n \ \ \"acc_stderr\": 0.012856468433722304\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n\ \ }\n}\n```" repo_url: https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|arc:challenge|25_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T01-34-27.623935.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_28T23_37_36.261412 path: - '**/details_harness|drop|3_2023-10-28T23-37-36.261412.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-28T23-37-36.261412.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_28T23_37_36.261412 path: - '**/details_harness|gsm8k|5_2023-10-28T23-37-36.261412.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-28T23-37-36.261412.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hellaswag|10_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T01_34_27.623935 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T01-34-27.623935.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T01_34_27.623935 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T01-34-27.623935.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T01-34-27.623935.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_28T23_37_36.261412 path: - '**/details_harness|winogrande|5_2023-10-28T23-37-36.261412.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-28T23-37-36.261412.parquet' - config_name: results data_files: - split: 2023_10_09T01_34_27.623935 path: - results_2023-10-09T01-34-27.623935.parquet - split: 2023_10_28T23_37_36.261412 path: - results_2023-10-28T23-37-36.261412.parquet - split: latest path: - results_2023-10-28T23-37-36.261412.parquet --- # Dataset Card for Evaluation run of s1ghhh/medllama-2-70b-qlora-1.1 ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [s1ghhh/medllama-2-70b-qlora-1.1](https://huggingface.co/s1ghhh/medllama-2-70b-qlora-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T23:37:36.261412](https://huggingface.co/datasets/open-llm-leaderboard/details_s1ghhh__medllama-2-70b-qlora-1.1/blob/main/results_2023-10-28T23-37-36.261412.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.4476719798657718, "em_stderr": 0.005092348829658167, "f1": 0.49099203020134397, "f1_stderr": 0.004914477006067904, "acc": 0.5814221507886975, "acc_stderr": 0.011551816841221033 }, "harness|drop|3": { "em": 0.4476719798657718, "em_stderr": 0.005092348829658167, "f1": 0.49099203020134397, "f1_stderr": 0.004914477006067904 }, "harness|gsm8k|5": { "acc": 0.3206974981046247, "acc_stderr": 0.012856468433722304 }, "harness|winogrande|5": { "acc": 0.8421468034727704, "acc_stderr": 0.010247165248719763 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
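The aggregated metrics above are plain nested dictionaries once the JSON is parsed, so post-processing needs no special tooling. Below is a minimal sketch: the numbers are copied verbatim from the latest-results block above, while the flattening step and the `accuracies`/`best_task` names are illustrative and not part of the leaderboard code.

```python
# Aggregated results for s1ghhh/medllama-2-70b-qlora-1.1, copied from the
# "Latest results" JSON shown above (stderr fields omitted for brevity).
latest = {
    "all": {"em": 0.4476719798657718, "f1": 0.49099203020134397, "acc": 0.5814221507886975},
    "harness|drop|3": {"em": 0.4476719798657718, "f1": 0.49099203020134397},
    "harness|gsm8k|5": {"acc": 0.3206974981046247},
    "harness|winogrande|5": {"acc": 0.8421468034727704},
}

# Flatten to a {task: accuracy} mapping for the tasks that report "acc".
accuracies = {task: m["acc"] for task, m in latest.items() if "acc" in m}

# Pick the task with the highest accuracy.
best_task = max(accuracies, key=accuracies.get)
print(best_task, round(accuracies[best_task], 4))  # harness|winogrande|5 0.8421
```

The same pattern works on the full per-task `results` configuration, since every entry follows the `"harness|<task>|<shots>"` key convention.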
38,678
[ [ -0.027099609375, -0.045135498046875, 0.01788330078125, 0.01654052734375, -0.01544952392578125, 0.0053558349609375, -0.0254364013671875, -0.01303863525390625, 0.030731201171875, 0.040679931640625, -0.0552978515625, -0.07061767578125, -0.04510498046875, 0.0105...
adam89/TinyStoriesChinese
2023-10-10T08:18:25.000Z
[ "license:cdla-sharing-1.0", "region:us" ]
adam89
null
null
1
0
2023-10-09T02:02:03
--- license: cdla-sharing-1.0 --- A Chinese translation of the [TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories) dataset. Only the `story` field has been translated (the translated field is `story_zh`): ```json { "story": "\n\nLily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing.\n\"Push me, Ben!\" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts.\nBen watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun.\n\"Can I swing too, Lily?\" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away.\nLily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries.\n\"Ow, ow, ow!\" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone.\nLily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben's hat. He left it for her.\nLily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again.", "instruction": { "prompt:": "Write a short story (3-5 paragraphs) which only uses very simple words that a 3 year old child would understand. The story should use the verb \"hang\", the noun \"foot\" and the adjective \"cute\". The story has the following features: the story should contain at least one dialogue. Remember to only use simple words!\n\nPossible story:", "words": [ "hang", "foot", "cute" ], "features": [ "Dialogue" ] }, "summary": "Lily and Ben play in the park and Lily gets too caught up in swinging, causing Ben to leave.
Lily falls off the swing and hurts herself, but Ben leaves his hat for her as a kind gesture.", "source": "GPT-4", "story_zh": "莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们在一棵大树下看到了一个秋千。莉莉想试试那个秋千。她跑到树下,爬上了秋千。\n\"推我,本!\"她说。本轻轻地推了她一下。莉莉感到很开心。她越荡越高,笑着喊叫。\n本看着莉莉。他觉得她很可爱。他也想荡秋千。他在莉莉停下来之后等着。但是莉莉没有停下来。她越荡越快。她玩得太高兴了。\n\"我也可以荡秋千吗,莉莉?\"本问。莉莉没听到他的话。她忙着荡秋千。本觉得很难过。他走开了。\n莉莉荡得太高,失去了平衡。她从秋千上摔下来,落在地上。她扭伤了脚。她哭了起来。\n\"哎呀,哎呀,哎呀!\"她说。她在找本。她希望他能帮助她。但本不在那里。他走了。\n莉莉感到很抱歉。她希望她能和本分享秋千。她希望他在那里拥抱她。她一瘸一拐地走到树下。她看到有什么东西挂在树枝上。那是本的帽子。他留给她的。\n莉莉笑了。她觉得本很好。她戴上了他的帽子。她希望他会回来。她想道歉。她想再次成为朋友。" } ``` Here is a look at the translation quality: ```text Lily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing. "Push me, Ben!" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts. Ben watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun. "Can I swing too, Lily?" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away. Lily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries. "Ow, ow, ow!" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone. Lily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben's hat. He left it for her. Lily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again.
莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们在一棵大树下看到了一个秋千。莉莉想试试那个秋千。她跑到树下,爬上了秋千。 "推我,本!"她说。本轻轻地推了她一下。莉莉感到很开心。她越荡越高,笑着喊叫。 本看着莉莉。他觉得她很可爱。他也想荡秋千。他在莉莉停下来之后等着。但是莉莉没有停下来。她越荡越快。她玩得太高兴了。 "我也可以荡秋千吗,莉莉?"本问。莉莉没听到他的话。她忙着荡秋千。本觉得很难过。他走开了。 莉莉荡得太高,失去了平衡。她从秋千上摔下来,落在地上。她扭伤了脚。她哭了起来。 "哎呀,哎呀,哎呀!"她说。她在找本。她希望他能帮助她。但本不在那里。他走了。 莉莉感到很抱歉。她希望她能和本分享秋千。她希望他在那里拥抱她。她一瘸一拐地走到树下。她看到有什么东西挂在树枝上。那是本的帽子。他留给她的。 莉莉笑了。她觉得本很好。她戴上了他的帽子。她希望他会回来。她想道歉。她想再次成为朋友。 ``` ```text Once upon a time, there was a little girl named Lily. She had a teddy bear that she loved so much. One day, she lost it while playing in the park. She looked everywhere, but she couldn't find it. She felt sad and scared without her teddy bear. Lily's mommy saw her crying and asked what was wrong. Lily told her that she lost her teddy bear. Mommy hugged her and said, "Don't worry, we'll search for it together." They went back to the park and looked everywhere. After a while, they found the teddy bear under a tree. Lily was so happy! She hugged her teddy bear and felt comfortable again. She said, "I hope I never lose you again, teddy bear." Mommy smiled and said, "Me too, Lily. You and teddy bear are the best of friends." And they all went home, happy and content. The end. 从前,有一个小女孩叫莉莉。她非常喜欢她的泰迪熊。有一天,她在公园里玩时把它弄丢了。她找遍了所有地方,但仍然找不到它。没有她的泰迪熊,她感到很难过和害怕。 莉莉的妈妈看到她哭泣,问她发生了什么事。莉莉告诉她自己把泰迪熊弄丢了。妈妈抱住她说:“别担心,我们会一起去找的。”他们回到公园,到处寻找。过了一会儿,他们在树下找到了泰迪熊。莉莉非常高兴! 她拥抱了她的泰迪熊,感觉又舒服了。她说:“我希望我再也不要失去你,泰迪熊。”妈妈笑着说:“我也这么想,莉莉。你和泰迪熊是最好的朋友。”然后他们都高高兴兴地回家了,感到非常满足。结束。 ``` ```text Once upon a time, there was a cute puppy named Max. Max was very adorable with his big, brown eyes and wagging tail. One day, Max's owner, Emily, told him that they needed to go to the post office to mail a letter. Max didn't know what that meant, but he was excited to go for a car ride. At the post office, Emily gave the letter to the nice lady behind the desk. The lady asked Emily for a number and Emily gave her one. Max didn't know what a number was, but he saw the lady type something on the computer. 
After they mailed the letter, Emily and Max went back to the car. Max was happy that they went on an adventure and he couldn't wait for the next one. 从前,有一只可爱的狗狗名叫Max。Max 非常可爱,大大的棕色眼睛和摇摆的尾巴都让人喜欢。有一天,Emily告诉Max他们需要去邮局寄一封信。Max并不知道那是什么意思,但他很兴奋能去兜风。 在邮局,Emily把信交给柜台后面友好的女士。女士问Emily要了一个号码,Emily给了她一个。Max并不知道什么是号码,但看到女士在电脑上输入了一些东西。 寄完信后,Emily和Max回到了车里。Max很高兴他们去了一趟冒险,他迫不及待地期待着下一次冒险。 ``` ```text One day, a kind and honest cat named Tom found a pretty velvet ribbon. He wanted to hang it on his door. But when he tried to hang it, he saw his friend, a small bird named Sue, was sad. Sue wanted the velvet ribbon too. Tom did not want to make Sue sad. So, he thought of a way to share the ribbon. He knew that they both liked to play games. Tom said, "Let's play a game. We can take turns to have the ribbon. Today, I will hang it on my door. Tomorrow, you can hang it on your tree." Sue liked this idea. They played and shared the velvet ribbon every day. Tom and Sue were both happy. They learned that sharing is a good way to solve problems and stay friends. 一天,一只名叫汤姆的善良诚实的猫发现了一条漂亮的天鹅绒彩带。他想把它挂在门上。但当他尝试挂上时,看到他的朋友,一只名叫苏的小鸟,很伤心。 苏也想要这条天鹅绒彩带。汤姆不想让苏伤心。所以,他想到了一个分享彩带的方法。他知道他们都喜欢玩游戏。 汤姆说:“我们来玩个游戏吧。我们可以轮流拥有这条彩带。今天,我把它挂在我门上。明天,你可以把它挂在你树上。” 苏喜欢这个主意。他们每天都会玩游戏并分享这根天鹅绒彩带。汤姆和苏都很开心。他们学会了分享是一种解决问题的良好方法,也是保持友谊的好方法。 ```
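Each row pairs the original English `story` with its `story_zh` translation, so the dataset can serve directly as parallel text. Below is a minimal sketch of handling one record; the inline record is a shortened stand-in for a real row (so the snippet runs without downloading anything — real rows come from `load_dataset("adam89/TinyStoriesChinese")` and also carry `instruction` and `summary` fields), and the `as_pair` helper is illustrative, not part of the dataset.

```python
# Shortened record mirroring the schema shown in the card above.
record = {
    "story": "\n\nLily and Ben are friends. They like to play in the park.",
    "story_zh": "莉莉和本是朋友。他们喜欢在公园里玩。",
    "source": "GPT-4",
}

def as_pair(rec):
    """Return the (English, Chinese) text pair for one record,
    stripping the leading/trailing whitespace seen in raw stories."""
    return rec["story"].strip(), rec["story_zh"].strip()

en, zh = as_pair(record)
print(en)  # Lily and Ben are friends. They like to play in the park.
print(zh)  # 莉莉和本是朋友。他们喜欢在公园里玩。
```

Mapping `as_pair` over the loaded split would yield an English–Chinese parallel corpus in one pass.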
7,015
[ [ -0.02349853515625, -0.0592041015625, 0.0292205810546875, 0.0457763671875, -0.0389404296875, -0.021453857421875, 0.0171966552734375, -0.06329345703125, 0.04510498046875, 0.024749755859375, -0.0546875, -0.0259552001953125, -0.03558349609375, 0.0129852294921875...
open-llm-leaderboard/details_pankajmathur__model_007
2023-10-09T02:04:34.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T02:03:33
--- pretty_name: Evaluation run of pankajmathur/model_007 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [pankajmathur/model_007](https://huggingface.co/pankajmathur/model_007) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__model_007\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-09T02:03:09.335068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__model_007/blob/main/results_2023-10-09T02-03-09.335068.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6901502879968988,\n\ \ \"acc_stderr\": 0.031344534847114004,\n \"acc_norm\": 0.6939037892141556,\n\ \ \"acc_norm_stderr\": 0.03131458982120537,\n \"mc1\": 0.44920440636474906,\n\ \ \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.6312306236860621,\n\ \ \"mc2_stderr\": 0.014945471343395618\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\ \ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6908982274447322,\n\ \ \"acc_stderr\": 0.004611787665905346,\n \"acc_norm\": 0.8765186217884884,\n\ \ \"acc_norm_stderr\": 0.003283165867631372\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\ \ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\ \ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \ \ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\ \ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\ \ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\ \ \"acc_norm_stderr\": 0.032639560491693344\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\ \ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\ \ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\ \ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745657,\n\ \ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745657\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\ \ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\ \ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\ \ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\ acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 
0.025646928361049398\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\ \ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\ \ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \ \ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"\ acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"\ acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\ : 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n\ \ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821678,\n \"\ acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821678\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078894,\n\ \ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078894\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\ \ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \ \ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7436974789915967,\n \"acc_stderr\": 0.02835962087053395,\n \ \ \"acc_norm\": 0.7436974789915967,\n \"acc_norm_stderr\": 0.02835962087053395\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\ acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8899082568807339,\n \"acc_stderr\": 0.0134199390186812,\n \"acc_norm\"\ : 0.8899082568807339,\n \"acc_norm_stderr\": 0.0134199390186812\n },\n\ \ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n\ \ \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n\ \ \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n\ \ \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \ \ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\ \ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\ \ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\ \ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445815,\n \"\ acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445815\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\ \ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\ \ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\ \ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\ \ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\ \ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\ \ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\ \ \"acc_stderr\": 0.011832954239305724,\n \"acc_norm\": 0.8748403575989783,\n\ \ \"acc_norm_stderr\": 0.011832954239305724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\ \ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5497206703910614,\n\ \ \"acc_stderr\": 0.016639615236845817,\n 
\"acc_norm\": 0.5497206703910614,\n\ \ \"acc_norm_stderr\": 0.016639615236845817\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182651,\n\ \ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182651\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\ \ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\ \ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.02202136610022019,\n\ \ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.02202136610022019\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291477,\n \ \ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291477\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.545632333767927,\n\ \ \"acc_stderr\": 0.012716941720734818,\n \"acc_norm\": 0.545632333767927,\n\ \ \"acc_norm_stderr\": 0.012716941720734818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377197,\n\ \ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377197\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\ : 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\ : {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n\ \ \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n\ \ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7918367346938775,\n\ \ \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.7918367346938775,\n\ \ \"acc_norm_stderr\": 0.025991117672813296\n },\n 
\"harness|hendrycksTest-sociology|5\"\ : {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n\ \ \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n\ \ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\ \ 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n\ \ \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\"\ : {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n\ \ \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n\ \ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n\ \ \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n\ \ \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\"\ : {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.01741294198611531,\n\ \ \"mc2\": 0.6312306236860621,\n \"mc2_stderr\": 0.014945471343395618\n\ \ }\n}\n```" repo_url: https://huggingface.co/pankajmathur/model_007 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|arc:challenge|25_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hellaswag|10_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T02_03_09.335068 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T02-03-09.335068.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T02-03-09.335068.parquet' - config_name: results data_files: - split: 2023_10_09T02_03_09.335068 path: - results_2023-10-09T02-03-09.335068.parquet - split: latest path: - results_2023-10-09T02-03-09.335068.parquet --- # Dataset Card for Evaluation run of pankajmathur/model_007 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/pankajmathur/model_007 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[pankajmathur/model_007](https://huggingface.co/pankajmathur/model_007) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pankajmathur__model_007", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-09T02:03:09.335068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__model_007/blob/main/results_2023-10-09T02-03-09.335068.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6901502879968988, "acc_stderr": 0.031344534847114004, "acc_norm": 0.6939037892141556, "acc_norm_stderr": 0.03131458982120537, "mc1": 0.44920440636474906, "mc1_stderr": 0.01741294198611531, "mc2": 0.6312306236860621, "mc2_stderr": 0.014945471343395618 }, "harness|arc:challenge|25": { "acc": 0.6749146757679181, "acc_stderr": 0.01368814730972912, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.013250012579393441 }, "harness|hellaswag|10": { "acc": 0.6908982274447322, "acc_stderr": 0.004611787665905346, "acc_norm": 0.8765186217884884, "acc_norm_stderr": 0.003283165867631372 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939098, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8157894736842105, "acc_stderr": 0.0315469804508223, "acc_norm": 0.8157894736842105, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8125, "acc_stderr": 0.032639560491693344, "acc_norm": 0.8125, "acc_norm_stderr": 0.032639560491693344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6680851063829787, "acc_stderr": 0.030783736757745657, "acc_norm": 0.6680851063829787, "acc_norm_stderr": 0.030783736757745657 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.046570472605949625, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.046570472605949625 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.455026455026455, "acc_stderr": 0.025646928361049398, "acc_norm": 0.455026455026455, "acc_norm_stderr": 0.025646928361049398 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677173, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8129032258064516, "acc_stderr": 0.022185710092252252, "acc_norm": 0.8129032258064516, "acc_norm_stderr": 0.022185710092252252 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5369458128078818, "acc_stderr": 0.035083705204426656, "acc_norm": 0.5369458128078818, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8666666666666667, "acc_stderr": 0.026544435312706467, "acc_norm": 0.8666666666666667, "acc_norm_stderr": 0.026544435312706467 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8888888888888888, "acc_stderr": 0.02239078763821678, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.02239078763821678 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9326424870466321, "acc_stderr": 0.018088393839078894, "acc_norm": 0.9326424870466321, "acc_norm_stderr": 0.018088393839078894 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7102564102564103, "acc_stderr": 0.023000628243687968, "acc_norm": 0.7102564102564103, "acc_norm_stderr": 0.023000628243687968 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7436974789915967, "acc_stderr": 0.02835962087053395, "acc_norm": 0.7436974789915967, "acc_norm_stderr": 0.02835962087053395 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4768211920529801, "acc_stderr": 0.04078093859163083, "acc_norm": 0.4768211920529801, "acc_norm_stderr": 0.04078093859163083 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8899082568807339, "acc_stderr": 0.0134199390186812, "acc_norm": 0.8899082568807339, "acc_norm_stderr": 0.0134199390186812 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 
0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9215686274509803, "acc_stderr": 0.018869514646658925, "acc_norm": 0.9215686274509803, "acc_norm_stderr": 0.018869514646658925 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8734177215189873, "acc_stderr": 0.021644195727955173, "acc_norm": 0.8734177215189873, "acc_norm_stderr": 0.021644195727955173 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.026936111912802273, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.026936111912802273 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494733, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.030922788320445815, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.030922788320445815 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742179, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822582, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822582 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9017094017094017, "acc_stderr": 0.019503444900757567, "acc_norm": 0.9017094017094017, "acc_norm_stderr": 0.019503444900757567 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 
0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8748403575989783, "acc_stderr": 0.011832954239305724, "acc_norm": 0.8748403575989783, "acc_norm_stderr": 0.011832954239305724 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7630057803468208, "acc_stderr": 0.02289408248992599, "acc_norm": 0.7630057803468208, "acc_norm_stderr": 0.02289408248992599 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5497206703910614, "acc_stderr": 0.016639615236845817, "acc_norm": 0.5497206703910614, "acc_norm_stderr": 0.016639615236845817 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.02555316999182651, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.02555316999182651 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7684887459807074, "acc_stderr": 0.023956532766639133, "acc_norm": 0.7684887459807074, "acc_norm_stderr": 0.023956532766639133 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8055555555555556, "acc_stderr": 0.02202136610022019, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.02202136610022019 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5460992907801419, "acc_stderr": 0.029700453247291477, "acc_norm": 0.5460992907801419, "acc_norm_stderr": 0.029700453247291477 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.545632333767927, "acc_stderr": 0.012716941720734818, "acc_norm": 0.545632333767927, "acc_norm_stderr": 0.012716941720734818 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7316176470588235, "acc_stderr": 0.026917481224377197, "acc_norm": 0.7316176470588235, "acc_norm_stderr": 0.026917481224377197 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.75, "acc_stderr": 0.01751781884501444, "acc_norm": 0.75, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7363636363636363, "acc_stderr": 0.04220224692971987, "acc_norm": 0.7363636363636363, "acc_norm_stderr": 0.04220224692971987 
}, "harness|hendrycksTest-security_studies|5": { "acc": 0.7918367346938775, "acc_stderr": 0.025991117672813296, "acc_norm": 0.7918367346938775, "acc_norm_stderr": 0.025991117672813296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.02484575321230604, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.02484575321230604 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.44920440636474906, "mc1_stderr": 0.01741294198611531, "mc2": 0.6312306236860621, "mc2_stderr": 0.014945471343395618 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
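As a quick sanity check on the aggregated numbers in the results JSON above, one can recompute a plain mean of the per-task accuracies. This is only a sketch: it assumes (without verifying against the harness code) that the "all" block is an unweighted mean, and `mean_acc` is a hypothetical helper, not part of this repository.

```python
# Sketch: unweighted mean of per-task "acc" values, mirroring the "all"
# block of the results JSON above. Assumption: "all" is a plain mean.
def mean_acc(per_task: dict) -> float:
    """Average the 'acc' field over all per-task score dicts."""
    accs = [scores["acc"] for scores in per_task.values()]
    return sum(accs) / len(accs)

# A small subset of the per-task scores reported above, for illustration:
subset = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
}
print(mean_acc(subset))  # mean over this subset only, not the full run
```

The same helper applied to all 57 `hendrycksTest-*` entries would approximate the aggregated MMLU accuracy, up to the exact weighting the harness uses.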
64,792
[ [ -0.05157470703125, -0.061126708984375, 0.0177154541015625, 0.01495361328125, -0.0122222900390625, -0.003467559814453125, 0.001758575439453125, -0.01397705078125, 0.037841796875, -0.0020313262939453125, -0.032318115234375, -0.048004150390625, -0.03228759765625, ...
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b
2023-10-09T02:13:29.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T02:12:28
--- pretty_name: Evaluation run of pankajmathur/orca_mini_v3_70b dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [pankajmathur/orca_mini_v3_70b](https://huggingface.co/pankajmathur/orca_mini_v3_70b)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-09T02:12:05.216705](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b/blob/main/results_2023-10-09T02-12-05.216705.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7010508529623596,\n\ \ \"acc_stderr\": 0.0309286120388273,\n \"acc_norm\": 0.7049679984523141,\n\ \ \"acc_norm_stderr\": 0.030896356315399304,\n \"mc1\": 0.42962056303549573,\n\ \ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\ \ \"mc2_stderr\": 0.015087648780065216\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n\ \ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n\ \ \"acc_stderr\": 0.00459390260197934,\n \"acc_norm\": 0.8785102569209321,\n\ \ \"acc_norm_stderr\": 0.0032602788112468337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\ \ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\ \ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \ \ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708052,\n\ \ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708052\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\ \ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\ \ \"acc_norm_stderr\": 0.032639560491693344\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\ \ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\ \ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\ acc_norm\": 0.48148148148148145,\n 
\"acc_norm_stderr\": 0.02573364199183898\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026783,\n \"\ acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026783\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\ acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\ : 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\ \ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\ acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\ \ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\ \ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \ \ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\ \ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\ acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\ acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\ acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\ acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \ \ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\ \ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\ \ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\ \ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\ acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\ \ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\ \ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\ \ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\ \ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\ \ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\ \ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\ \ \"acc_stderr\": 0.011832954239305733,\n \"acc_norm\": 0.8748403575989783,\n\ \ \"acc_norm_stderr\": 0.011832954239305733\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n\ \ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\ \ \"acc_stderr\": 0.01661139368726857,\n \"acc_norm\": 
0.5575418994413408,\n\ \ \"acc_norm_stderr\": 0.01661139368726857\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\ \ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\ \ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\ \ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.021330868762127062,\n\ \ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.021330868762127062\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \ \ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559973924380704,\n\ \ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.559973924380704,\n\ \ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\ \ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427653,\n \ \ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427653\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\ \ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\ \ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\ \ \"harness|hendrycksTest-sociology|5\": {\n 
\"acc\": 0.8756218905472637,\n\ \ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\ \ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \ \ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\ \ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\ \ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\ \ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\ \ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\ \ \"mc2_stderr\": 0.015087648780065216\n }\n}\n```" repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_70b leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|arc:challenge|25_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hellaswag|10_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-12-05.216705.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T02_12_05.216705 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T02-12-05.216705.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T02-12-05.216705.parquet' - config_name: results data_files: - split: 2023_10_09T02_12_05.216705 path: - results_2023-10-09T02-12-05.216705.parquet - split: latest path: - results_2023-10-09T02-12-05.216705.parquet --- # Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_70b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_70b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[pankajmathur/orca_mini_v3_70b](https://huggingface.co/pankajmathur/orca_mini_v3_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-09T02:12:05.216705](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_70b/blob/main/results_2023-10-09T02-12-05.216705.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7010508529623596, "acc_stderr": 0.0309286120388273, "acc_norm": 0.7049679984523141, "acc_norm_stderr": 0.030896356315399304, "mc1": 0.42962056303549573, "mc1_stderr": 0.017329234580409098, "mc2": 0.6126968953087459, "mc2_stderr": 0.015087648780065216 }, "harness|arc:challenge|25": { "acc": 0.6646757679180887, "acc_stderr": 0.013796182947785562, "acc_norm": 0.712457337883959, "acc_norm_stderr": 0.013226719056266129 }, "harness|hellaswag|10": { "acc": 0.6951802429794861, "acc_stderr": 0.00459390260197934, "acc_norm": 0.8785102569209321, "acc_norm_stderr": 0.0032602788112468337 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04171654161354543, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04171654161354543 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8157894736842105, "acc_stderr": 0.0315469804508223, "acc_norm": 0.8157894736842105, "acc_norm_stderr": 0.0315469804508223 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.77, "acc_stderr": 0.042295258468165044, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7396226415094339, "acc_stderr": 0.027008766090708052, "acc_norm": 0.7396226415094339, "acc_norm_stderr": 0.027008766090708052 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8125, "acc_stderr": 0.032639560491693344, "acc_norm": 0.8125, "acc_norm_stderr": 0.032639560491693344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6936170212765957, "acc_stderr": 0.03013590647851756, "acc_norm": 0.6936170212765957, "acc_norm_stderr": 0.03013590647851756 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6344827586206897, "acc_stderr": 0.040131241954243856, "acc_norm": 0.6344827586206897, "acc_norm_stderr": 0.040131241954243856 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.02573364199183898, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.02573364199183898 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8161290322580645, "acc_stderr": 0.02203721734026783, "acc_norm": 0.8161290322580645, "acc_norm_stderr": 0.02203721734026783 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5665024630541872, "acc_stderr": 0.034867317274198714, "acc_norm": 0.5665024630541872, "acc_norm_stderr": 0.034867317274198714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.02845038880528436, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.02845038880528436 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8737373737373737, "acc_stderr": 0.023664359402880232, "acc_norm": 0.8737373737373737, "acc_norm_stderr": 0.023664359402880232 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9326424870466321, "acc_stderr": 0.0180883938390789, "acc_norm": 0.9326424870466321, "acc_norm_stderr": 0.0180883938390789 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7025641025641025, "acc_stderr": 0.023177408131465942, "acc_norm": 0.7025641025641025, "acc_norm_stderr": 0.023177408131465942 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7478991596638656, "acc_stderr": 0.028205545033277723, "acc_norm": 0.7478991596638656, "acc_norm_stderr": 0.028205545033277723 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5033112582781457, "acc_stderr": 0.04082393379449654, "acc_norm": 0.5033112582781457, "acc_norm_stderr": 0.04082393379449654 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9009174311926605, "acc_stderr": 0.01280978008187893, "acc_norm": 0.9009174311926605, "acc_norm_stderr": 0.01280978008187893 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5879629629629629, "acc_stderr": 0.03356787758160831, "acc_norm": 0.5879629629629629, "acc_norm_stderr": 
0.03356787758160831 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9264705882352942, "acc_stderr": 0.018318855850089678, "acc_norm": 0.9264705882352942, "acc_norm_stderr": 0.018318855850089678 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9029535864978903, "acc_stderr": 0.019269323025640255, "acc_norm": 0.9029535864978903, "acc_norm_stderr": 0.019269323025640255 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7757847533632287, "acc_stderr": 0.027991534258519513, "acc_norm": 0.7757847533632287, "acc_norm_stderr": 0.027991534258519513 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8625954198473282, "acc_stderr": 0.030194823996804475, "acc_norm": 0.8625954198473282, "acc_norm_stderr": 0.030194823996804475 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8760330578512396, "acc_stderr": 0.030083098716035202, "acc_norm": 0.8760330578512396, "acc_norm_stderr": 0.030083098716035202 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8159509202453987, "acc_stderr": 0.030446777687971726, "acc_norm": 0.8159509202453987, "acc_norm_stderr": 0.030446777687971726 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9145299145299145, "acc_stderr": 0.01831589168562585, "acc_norm": 0.9145299145299145, "acc_norm_stderr": 0.01831589168562585 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 
0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8748403575989783, "acc_stderr": 0.011832954239305733, "acc_norm": 0.8748403575989783, "acc_norm_stderr": 0.011832954239305733 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.791907514450867, "acc_stderr": 0.0218552552634218, "acc_norm": 0.791907514450867, "acc_norm_stderr": 0.0218552552634218 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5575418994413408, "acc_stderr": 0.01661139368726857, "acc_norm": 0.5575418994413408, "acc_norm_stderr": 0.01661139368726857 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958157, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958157 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.77491961414791, "acc_stderr": 0.023720088516179027, "acc_norm": 0.77491961414791, "acc_norm_stderr": 0.023720088516179027 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8209876543209876, "acc_stderr": 0.021330868762127062, "acc_norm": 0.8209876543209876, "acc_norm_stderr": 0.021330868762127062 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5851063829787234, "acc_stderr": 0.0293922365846125, "acc_norm": 0.5851063829787234, "acc_norm_stderr": 0.0293922365846125 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.559973924380704, "acc_stderr": 0.012678037478574513, "acc_norm": 0.559973924380704, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7169117647058824, "acc_stderr": 0.02736586113151381, "acc_norm": 0.7169117647058824, "acc_norm_stderr": 0.02736586113151381 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7549019607843137, "acc_stderr": 0.017401816711427653, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.017401816711427653 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 
0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8, "acc_stderr": 0.02560737598657916, "acc_norm": 0.8, "acc_norm_stderr": 0.02560737598657916 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327, "acc_norm": 0.8756218905472637, "acc_norm_stderr": 0.023335401790166327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466125, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466125 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.42962056303549573, "mc1_stderr": 0.017329234580409098, "mc2": 0.6126968953087459, "mc2_stderr": 0.015087648780065216 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
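The "all" block in the results JSON above roughly aggregates the per-task scores as an unweighted mean. A minimal sketch of that recomputation (this is an illustration, not part of the original card; the three sample scores are taken from the per-task entries above, whereas the real "all" entry averages every task in the run):

```python
# Recompute a macro-average accuracy from per-task results,
# mirroring (approximately) how the aggregated "all" entry is formed.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8157894736842105},
}

# Unweighted mean over the tasks included above.
accs = [task["acc"] for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))
```

Run over the full set of task entries in a results file, the same loop reproduces the leaderboard-style aggregate accuracy.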
64,871
[ [ -0.051300048828125, -0.05926513671875, 0.0170745849609375, 0.0130462646484375, -0.01294708251953125, -0.0051116943359375, 0.0014400482177734375, -0.0159454345703125, 0.03985595703125, -0.0014743804931640625, -0.03472900390625, -0.04864501953125, -0.0312042236328...
Brthy467/mixeddatasets
2023-10-09T02:47:53.000Z
[ "region:us" ]
Brthy467
null
null
0
0
2023-10-09T02:46:50
Entry not found
15
[ [ -0.0213775634765625, -0.01497650146484375, 0.05718994140625, 0.02880859375, -0.0350341796875, 0.046478271484375, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.0170135498046875, -0.052093505859375, -0.01497650146484375, -0.0604248046875, 0.0379028...
QEU/databricks-dolly-16k-line_ja-4_of_4
2023-10-09T03:14:00.000Z
[ "license:apache-2.0", "region:us" ]
QEU
null
null
0
0
2023-10-09T03:09:56
--- license: apache-2.0 --- # This dataset is a Japanese-language version of databrick-15k, which became well known in 2023. ## Note that the data is split into 4 parts. ## The contents of the data have been changed substantially. (About half of the records no longer resemble the originals.) - English equivalents were appended in parentheses after katakana loanwords. - Records that were anomalous as Q&A pairs were corrected. - Low-information entries, such as trivia about "Game of Thrones", were removed. - Various other information was added as experiments. For more details, see [this blog](https://jpnqeur23lmqsw.blogspot.com/2023/09/qeur23llmdss10-databricks15k.html).
388
[ [ -0.04833984375, -0.0693359375, 0.0321044921875, 0.055084228515625, -0.040985107421875, 0.0078125, 0.02264404296875, -0.01554107666015625, 0.0300750732421875, 0.02264404296875, -0.057373046875, -0.05078125, -0.02325439453125, 0.01239776611328125, -0.01905...
naphatmanu/index_modern_1
2023-10-09T03:37:03.000Z
[ "region:us" ]
naphatmanu
null
null
0
0
2023-10-09T03:36:59
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
naphatmanu/index-modern-luxury-1
2023-10-09T03:46:23.000Z
[ "region:us" ]
naphatmanu
null
null
0
0
2023-10-09T03:46:20
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
Brthy467/mixedDataset2
2023-10-10T01:07:45.000Z
[ "region:us" ]
Brthy467
null
null
0
0
2023-10-09T03:56:48
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf
2023-10-26T23:30:23.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T04:04:43
--- pretty_name: Evaluation run of elliotthwang/elliott_Llama-2-7b-hf dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [elliotthwang/elliott_Llama-2-7b-hf](https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-26T23:30:10.386120](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf/blob/main/results_2023-10-26T23-30-10.386120.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\ \ \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05575817953020141,\n\ \ \"f1_stderr\": 0.001306153544964195,\n \"acc\": 0.4026884110741377,\n\ \ \"acc_stderr\": 0.009681922567248534\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n\ \ \"f1\": 0.05575817953020141,\n \"f1_stderr\": 0.001306153544964195\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06899166034874905,\n \ \ \"acc_stderr\": 0.006980995834838602\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658464\n\ \ }\n}\n```" repo_url: https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|arc:challenge|25_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T04-04-19.372525.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_26T23_30_10.386120 path: - '**/details_harness|drop|3_2023-10-26T23-30-10.386120.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-26T23-30-10.386120.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_26T23_30_10.386120 path: - '**/details_harness|gsm8k|5_2023-10-26T23-30-10.386120.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-26T23-30-10.386120.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hellaswag|10_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T04-04-19.372525.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T04_04_19.372525 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T04-04-19.372525.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T04-04-19.372525.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_26T23_30_10.386120 path: - '**/details_harness|winogrande|5_2023-10-26T23-30-10.386120.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-26T23-30-10.386120.parquet' - config_name: results data_files: - split: 2023_10_09T04_04_19.372525 path: - results_2023-10-09T04-04-19.372525.parquet - split: 2023_10_26T23_30_10.386120 path: - results_2023-10-26T23-30-10.386120.parquet - split: latest path: - results_2023-10-26T23-30-10.386120.parquet --- # Dataset Card for Evaluation run of elliotthwang/elliott_Llama-2-7b-hf ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [elliotthwang/elliott_Llama-2-7b-hf](https://huggingface.co/elliotthwang/elliott_Llama-2-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-26T23:30:10.386120](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__elliott_Llama-2-7b-hf/blob/main/results_2023-10-26T23-30-10.386120.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each one in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.001153523489932886, "em_stderr": 0.00034761798968571027, "f1": 0.05575817953020141, "f1_stderr": 0.001306153544964195, "acc": 0.4026884110741377, "acc_stderr": 0.009681922567248534 }, "harness|drop|3": { "em": 0.001153523489932886, "em_stderr": 0.00034761798968571027, "f1": 0.05575817953020141, "f1_stderr": 0.001306153544964195 }, "harness|gsm8k|5": { "acc": 0.06899166034874905, "acc_stderr": 0.006980995834838602 }, "harness|winogrande|5": { "acc": 0.7363851617995264, "acc_stderr": 0.012382849299658464 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,732
[ [ -0.031585693359375, -0.046112060546875, 0.0255584716796875, 0.021392822265625, -0.0167999267578125, 0.0163421630859375, -0.0277557373046875, -0.0228118896484375, 0.03369140625, 0.0386962890625, -0.05621337890625, -0.07220458984375, -0.051666259765625, 0.0206...
yukuai0011/elec5307-project-2-dataset-full-public
2023-10-09T16:20:58.000Z
[ "region:us" ]
yukuai0011
null
null
0
0
2023-10-09T04:09:27
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': Apple '1': Avocado '2': Banana '3': Blueberry '4': Coconut '5': Cucumber '6': Dragon_fruit '7': Grape '8': Grapefruit '9': Kiwifruit '10': Lemon '11': Lychee '12': Mangoes '13': Orange '14': Papaya '15': Passion fruit '16': Peach '17': Pear '18': Pineapple '19': Pomegranate '20': Raspberry '21': Rockmelon '22': Strawberries '23': Tomato '24': Waterlemon splits: - name: train num_bytes: 344011868.018 num_examples: 3026 download_size: 319895933 dataset_size: 344011868.018 --- # Dataset Card for "elec5307-project-2-dataset-full-public" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
1,152
[ [ -0.039093017578125, 0.00438690185546875, 0.0186004638671875, 0.0305023193359375, -0.0213470458984375, -0.0006489753723144531, 0.006061553955078125, -0.0242919921875, 0.06158447265625, 0.044464111328125, -0.06170654296875, -0.0648193359375, -0.03741455078125, ...
yukuai0011/elec5307-project-2-dataset-splited-public
2023-10-09T16:47:41.000Z
[ "region:us" ]
yukuai0011
null
null
0
0
2023-10-09T04:11:15
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': Apple '1': Avocado '2': Banana '3': Blueberry '4': Coconut '5': Cucumber '6': Dragon_fruit '7': Grape '8': Grapefruit '9': Kiwifruit '10': Lemon '11': Lychee '12': Mangoes '13': Orange '14': Papaya '15': Passion fruit '16': Peach '17': Pear '18': Pineapple '19': Pomegranate '20': Raspberry '21': Rockmelon '22': Strawberries '23': Tomato '24': Waterlemon splits: - name: train num_bytes: 270703771.307 num_examples: 2421 - name: test num_bytes: 63336528.0 num_examples: 605 download_size: 320028339 dataset_size: 334040299.307 --- # Dataset Card for "elec5307-project-2-dataset-splited-public" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
1,256
[ [ -0.048553466796875, -0.00391387939453125, 0.0078582763671875, 0.0284271240234375, -0.02703857421875, 0.0162200927734375, 0.00994110107421875, -0.0255126953125, 0.0640869140625, 0.04730224609375, -0.0643310546875, -0.048736572265625, -0.036895751953125, -0.00...
MattReiley/first
2023-10-09T04:19:43.000Z
[ "region:us" ]
MattReiley
null
null
0
0
2023-10-09T04:19:43
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
Zhongyuan/sd-dataset
2023-11-02T07:22:19.000Z
[ "region:us" ]
Zhongyuan
null
null
0
0
2023-10-09T04:28:47
Entry not found
15
[ [ -0.02142333984375, -0.014984130859375, 0.057220458984375, 0.0288238525390625, -0.03509521484375, 0.04656982421875, 0.052520751953125, 0.00506591796875, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060455322265625, 0.03793334...
Andyrasika/potholes-dataset
2023-10-09T04:43:06.000Z
[ "region:us" ]
Andyrasika
null
null
0
0
2023-10-09T04:43:01
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 26575443.0 num_examples: 350 - name: validation num_bytes: 2929769.0 num_examples: 34 - name: test num_bytes: 1442112.0 num_examples: 16 download_size: 30638600 dataset_size: 30947324.0 --- # Dataset Card for "potholes-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
666
[ [ -0.03466796875, -0.025726318359375, 0.033111572265625, 0.03192138671875, -0.030029296875, 0.00759124755859375, 0.023162841796875, 0.002716064453125, 0.04144287109375, 0.044097900390625, -0.044036865234375, -0.056365966796875, -0.050872802734375, -0.026382446...
teowu/LLVisionQA-QBench
2023-10-13T19:24:55.000Z
[ "license:cc-by-nc-sa-4.0", "arxiv:2309.14181", "region:us" ]
teowu
null
null
0
0
2023-10-09T04:53:34
--- license: cc-by-nc-sa-4.0 --- Dataset for Paper: **Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision**. *Images*: `images.tar` `dev`-*labels*: `llvisionqa_dev.json` `test`-*labels*: `llvisionqa_test.json` See Github for Usage: https://github.com/vqassessment/q-bench. Feel free to cite us. ```bibtex @article{wu2023qbench, title={Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision}, author={Wu, Haoning and Zhang, Zicheng and Zhang, Erli and Chen, Chaofeng and Liao, Liang and Wang, Annan and Li, Chunyi and Sun, Wenxiu and Yan, Qiong and Zhai, Guangtao and Lin, Weisi}, year={2023}, eprint={2309.14181}, } ```
695
[ [ -0.0168304443359375, -0.002147674560546875, 0.0362548828125, 0.0012178421020507812, -0.0226898193359375, -0.028106689453125, 0.0221405029296875, -0.0394287109375, 0.00545501708984375, 0.036468505859375, -0.036712646484375, -0.049530029296875, -0.023956298828125,...
teowu/LLDescribe-QBench
2023-10-09T08:26:58.000Z
[ "license:cc-by-nc-sa-4.0", "arxiv:2309.14181", "region:us" ]
teowu
null
null
0
0
2023-10-09T04:54:58
--- license: cc-by-nc-sa-4.0 --- Dataset for Paper: **Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision**. See Github: https://github.com/vqassessment/q-bench. Feel free to cite us. ```bibtex @article{wu2023qbench, title={Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision}, author={Wu, Haoning and Zhang, Zicheng and Zhang, Erli and Chen, Chaofeng and Liao, Liang and Wang, Annan and Li, Chunyi and Sun, Wenxiu and Yan, Qiong and Zhai, Guangtao and Lin, Weisi}, year={2023}, eprint={2309.14181}, } ```
579
[ [ -0.01493072509765625, -0.00981903076171875, 0.028228759765625, 0.0033779144287109375, -0.0167694091796875, -0.03997802734375, 0.0225830078125, -0.04156494140625, 0.001739501953125, 0.03082275390625, -0.0361328125, -0.040985107421875, -0.01282501220703125, -0...
chunpingvi/tinystories_raw
2023-10-09T05:26:31.000Z
[ "region:us" ]
chunpingvi
null
null
0
0
2023-10-09T05:26:08
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
darcy01/customeDataSet
2023-10-09T06:08:47.000Z
[ "region:us" ]
darcy01
null
null
0
0
2023-10-09T05:48:05
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
open-llm-leaderboard/details_sequelbox__SharpBalance
2023-10-23T18:53:21.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T05:50:11
--- pretty_name: Evaluation run of sequelbox/SharpBalance dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [sequelbox/SharpBalance](https://huggingface.co/sequelbox/SharpBalance) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__SharpBalance\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-23T18:53:09.205615](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SharpBalance/blob/main/results_2023-10-23T18-53-09.205615.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.30861996644295303,\n\ \ \"em_stderr\": 0.00473053301508219,\n \"f1\": 0.3692638422818801,\n\ \ \"f1_stderr\": 0.004628079358040571,\n \"acc\": 0.5935214367393442,\n\ \ \"acc_stderr\": 0.011697898266884079\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.30861996644295303,\n \"em_stderr\": 0.00473053301508219,\n\ \ \"f1\": 0.3692638422818801,\n \"f1_stderr\": 0.004628079358040571\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3464746019711903,\n \ \ \"acc_stderr\": 0.013107179054313396\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\ \ }\n}\n```" repo_url: https://huggingface.co/sequelbox/SharpBalance leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|arc:challenge|25_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T05-49-47.525988.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_23T18_53_09.205615 path: - '**/details_harness|drop|3_2023-10-23T18-53-09.205615.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-23T18-53-09.205615.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_23T18_53_09.205615 path: - '**/details_harness|gsm8k|5_2023-10-23T18-53-09.205615.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-23T18-53-09.205615.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hellaswag|10_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T05-49-47.525988.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T05_49_47.525988 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T05-49-47.525988.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T05-49-47.525988.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_23T18_53_09.205615 path: - '**/details_harness|winogrande|5_2023-10-23T18-53-09.205615.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-23T18-53-09.205615.parquet' - config_name: results data_files: - split: 2023_10_09T05_49_47.525988 path: - results_2023-10-09T05-49-47.525988.parquet - split: 2023_10_23T18_53_09.205615 path: - results_2023-10-23T18-53-09.205615.parquet - split: latest path: - results_2023-10-23T18-53-09.205615.parquet --- # Dataset Card for Evaluation run of sequelbox/SharpBalance ## Dataset Description - 
**Homepage:** - **Repository:** https://huggingface.co/sequelbox/SharpBalance - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [sequelbox/SharpBalance](https://huggingface.co/sequelbox/SharpBalance) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_sequelbox__SharpBalance", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T18:53:09.205615](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__SharpBalance/blob/main/results_2023-10-23T18-53-09.205615.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You will find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.30861996644295303, "em_stderr": 0.00473053301508219, "f1": 0.3692638422818801, "f1_stderr": 0.004628079358040571, "acc": 0.5935214367393442, "acc_stderr": 0.011697898266884079 }, "harness|drop|3": { "em": 0.30861996644295303, "em_stderr": 0.00473053301508219, "f1": 0.3692638422818801, "f1_stderr": 0.004628079358040571 }, "harness|gsm8k|5": { "acc": 0.3464746019711903, "acc_stderr": 0.013107179054313396 }, "harness|winogrande|5": { "acc": 0.840568271507498, "acc_stderr": 0.010288617479454764 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
38,564
[ [ -0.027008056640625, -0.04736328125, 0.0094451904296875, 0.01007843017578125, -0.01019287109375, 0.017425537109375, -0.00783538818359375, -0.00046896934509277344, 0.01276397705078125, 0.038055419921875, -0.059600830078125, -0.06640625, -0.0498046875, 0.006591...
ap07/input_dataset
2023-10-09T05:56:32.000Z
[ "region:us" ]
ap07
null
null
0
0
2023-10-09T05:51:29
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
chunpingvi/tinystories
2023-10-09T06:09:19.000Z
[ "region:us" ]
chunpingvi
null
null
0
0
2023-10-09T06:09:02
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
darcy01/autotrain-data-hanzbydarcycao
2023-10-09T06:13:45.000Z
[ "task_categories:translation", "language:zh", "language:en", "region:us" ]
darcy01
null
null
0
0
2023-10-09T06:10:02
--- language: - zh - en task_categories: - translation --- # AutoTrain Dataset for project: hanzbydarcycao ## Dataset Description This dataset has been automatically processed by AutoTrain for project hanzbydarcycao. ### Languages The BCP-47 code for the dataset's language is zh2en. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "source": "sarashi", "target": "sarashi" }, { "source": "Dojo", "target": "Dojo" } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "source": "Value(dtype='string', id=None)", "target": "Value(dtype='string', id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: | Split name | Num samples | | ------------ | ------------------- | | train | 98 | | valid | 25 |
927
[ [ -0.024444580078125, 0.0105438232421875, 0.00872802734375, 0.011962890625, -0.027069091796875, 0.011322021484375, -0.015655517578125, -0.0263214111328125, -0.0006575584411621094, 0.0218505859375, -0.060302734375, -0.051055908203125, -0.03485107421875, 0.00572...
Falah/flowers_seed_prompts
2023-10-09T06:38:59.000Z
[ "region:us" ]
Falah
null
null
0
0
2023-10-09T06:38:57
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 89719 num_examples: 1000 download_size: 2001 dataset_size: 89719 --- # Dataset Card for "flowers_seed_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
358
[ [ -0.0316162109375, -0.0212860107421875, 0.0253143310546875, 0.048431396484375, -0.002094268798828125, -0.0201568603515625, 0.00791168212890625, -0.0005426406860351562, 0.043365478515625, 0.01220703125, -0.08782958984375, -0.05145263671875, -0.0399169921875, 0...
mychen76/ShareGPT_V3_unfiltered_cleaned_small_9k
2023-10-09T06:56:38.000Z
[ "region:us" ]
mychen76
null
null
0
0
2023-10-09T06:49:29
--- dataset_info: features: - name: id dtype: string - name: conversations list: - name: from dtype: string - name: markdown struct: - name: answer dtype: string - name: index dtype: int64 - name: type dtype: string - name: text dtype: string - name: value dtype: string splits: - name: train num_bytes: 57188795.51333581 num_examples: 8473 - name: test num_bytes: 6358060.35330607 num_examples: 942 - name: valid num_bytes: 641205.6619576185 num_examples: 95 download_size: 28307098 dataset_size: 64188061.5285995 --- # Dataset Card for "ShareGPT_V3_unfiltered_cleaned_small_9k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
844
[ [ -0.0423583984375, -0.006320953369140625, 0.024688720703125, 0.00713348388671875, -0.0450439453125, -0.007709503173828125, 0.00949859619140625, -0.00616455078125, 0.055328369140625, 0.047210693359375, -0.06005859375, -0.041229248046875, -0.046966552734375, -0...
open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B
2023-10-09T06:56:38.000Z
[ "region:us" ]
open-llm-leaderboard
null
null
0
0
2023-10-09T06:55:37
--- pretty_name: Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [krevas/LDCC-Instruct-Llama-2-ko-13B](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-10-09T06:55:19.126017](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B/blob/main/results_2023-10-09T06-55-19.126017.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5140887884293746,\n\ \ \"acc_stderr\": 0.034831195333324204,\n \"acc_norm\": 0.5180581384469735,\n\ \ \"acc_norm_stderr\": 0.03481277047428223,\n \"mc1\": 0.26193390452876375,\n\ \ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.37999611805412853,\n\ \ \"mc2_stderr\": 0.013428724763055466\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636588,\n\ \ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182526\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6096395140410277,\n\ \ \"acc_stderr\": 0.004868341056566223,\n \"acc_norm\": 0.8156741684923322,\n\ \ \"acc_norm_stderr\": 0.0038695723555438196\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\ \ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\ \ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\ \ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\ \ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 
0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\ \ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\ \ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\ \ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714506,\n\ \ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714506\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\ \ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\ \ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\ \ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\ \ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\ \ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\ acc_norm\": 
0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\ \ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\ \ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n\ \ \"acc_stderr\": 0.027941727346256304,\n \"acc_norm\": 0.5935483870967742,\n\ \ \"acc_norm_stderr\": 0.027941727346256304\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\ \ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\ \ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6262626262626263,\n \"acc_stderr\": 0.034468977386593325,\n \"\ acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.034468977386593325\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\ \ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \ \ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \ \ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\ acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.6678899082568808,\n \"acc_stderr\": 0.02019268298542333,\n \"\ acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.02019268298542333\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\ acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\ acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \ \ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\ \ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\ \ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\ \ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\ acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\ \ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\ \ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\ \ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\ \ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\ \ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\ \ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\ \ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\ \ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\ \ \"acc_stderr\": 0.016203792703197776,\n \"acc_norm\": 0.7113665389527458,\n\ \ \"acc_norm_stderr\": 0.016203792703197776\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5520231213872833,\n \"acc_stderr\": 0.02677299065336182,\n\ \ \"acc_norm\": 0.5520231213872833,\n \"acc_norm_stderr\": 0.02677299065336182\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\ \ \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n\ \ \"acc_norm_stderr\": 
0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\ \ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\ \ \"acc_stderr\": 0.027316847674192703,\n \"acc_norm\": 0.6366559485530546,\n\ \ \"acc_norm_stderr\": 0.027316847674192703\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380157,\n\ \ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380157\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \ \ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\ \ \"acc_stderr\": 0.012564871542534353,\n \"acc_norm\": 0.4106910039113429,\n\ \ \"acc_norm_stderr\": 0.012564871542534353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n\ \ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181357,\n \ \ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181357\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\ \ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\ \ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\ \ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.7064676616915423,\n\ \ \"acc_stderr\": 0.03220024104534204,\n \"acc_norm\": 0.7064676616915423,\n\ \ \"acc_norm_stderr\": 0.03220024104534204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\ \ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\ \ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\ \ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\ \ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.37999611805412853,\n\ \ \"mc2_stderr\": 0.013428724763055466\n }\n}\n```" repo_url: https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|arc:challenge|25_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hellaswag|10_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T06-55-19.126017.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_09T06_55_19.126017 path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T06-55-19.126017.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-09T06-55-19.126017.parquet' - config_name: results data_files: - split: 2023_10_09T06_55_19.126017 path: - results_2023-10-09T06-55-19.126017.parquet - split: latest path: - results_2023-10-09T06-55-19.126017.parquet --- # Dataset Card for Evaluation run of krevas/LDCC-Instruct-Llama-2-ko-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[krevas/LDCC-Instruct-Llama-2-ko-13B](https://huggingface.co/krevas/LDCC-Instruct-Llama-2-ko-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B", "harness_truthfulqa_mc_0", split="latest") ``` ## Latest results These are the [latest results from run 2023-10-09T06:55:19.126017](https://huggingface.co/datasets/open-llm-leaderboard/details_krevas__LDCC-Instruct-Llama-2-ko-13B/blob/main/results_2023-10-09T06-55-19.126017.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5140887884293746, "acc_stderr": 0.034831195333324204, "acc_norm": 0.5180581384469735, "acc_norm_stderr": 0.03481277047428223, "mc1": 0.26193390452876375, "mc1_stderr": 0.01539211880501503, "mc2": 0.37999611805412853, "mc2_stderr": 0.013428724763055466 }, "harness|arc:challenge|25": { "acc": 0.5392491467576792, "acc_stderr": 0.014566303676636588, "acc_norm": 0.5674061433447098, "acc_norm_stderr": 0.014478005694182526 }, "harness|hellaswag|10": { "acc": 0.6096395140410277, "acc_stderr": 0.004868341056566223, "acc_norm": 0.8156741684923322, "acc_norm_stderr": 0.0038695723555438196 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.04065771002562605, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.04065771002562605 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5358490566037736, "acc_stderr": 0.030693675018458003, "acc_norm": 0.5358490566037736, "acc_norm_stderr": 0.030693675018458003 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04122728707651282, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04122728707651282 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, 
"acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.44508670520231214, "acc_stderr": 0.03789401760283647, "acc_norm": 0.44508670520231214, "acc_norm_stderr": 0.03789401760283647 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.042207736591714506, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.042207736591714506 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.0433913832257986, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.0433913832257986 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.42758620689655175, "acc_stderr": 0.041227371113703316, "acc_norm": 0.42758620689655175, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3201058201058201, "acc_stderr": 0.024026846392873506, "acc_norm": 0.3201058201058201, "acc_norm_stderr": 0.024026846392873506 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5935483870967742, "acc_stderr": 0.027941727346256304, "acc_norm": 0.5935483870967742, "acc_norm_stderr": 0.027941727346256304 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3448275862068966, "acc_stderr": 0.03344283744280458, "acc_norm": 0.3448275862068966, "acc_norm_stderr": 0.03344283744280458 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6303030303030303, "acc_stderr": 0.03769430314512567, "acc_norm": 0.6303030303030303, "acc_norm_stderr": 0.03769430314512567 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6262626262626263, "acc_stderr": 0.034468977386593325, "acc_norm": 0.6262626262626263, "acc_norm_stderr": 0.034468977386593325 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7305699481865285, "acc_stderr": 0.03201867122877794, "acc_norm": 0.7305699481865285, "acc_norm_stderr": 0.03201867122877794 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4461538461538462, "acc_stderr": 0.02520357177302833, "acc_norm": 0.4461538461538462, "acc_norm_stderr": 0.02520357177302833 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066485, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.0324371805513741, "acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.0324371805513741 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, "acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6678899082568808, "acc_stderr": 0.02019268298542333, "acc_norm": 0.6678899082568808, "acc_norm_stderr": 0.02019268298542333 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35185185185185186, "acc_stderr": 
0.032568505702936484, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.032568505702936484 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236436, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236436 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6919831223628692, "acc_stderr": 0.0300523893356057, "acc_norm": 0.6919831223628692, "acc_norm_stderr": 0.0300523893356057 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.043285772152629715, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.043285772152629715 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6859504132231405, "acc_stderr": 0.04236964753041018, "acc_norm": 0.6859504132231405, "acc_norm_stderr": 0.04236964753041018 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6018518518518519, "acc_stderr": 0.04732332615978813, "acc_norm": 0.6018518518518519, "acc_norm_stderr": 0.04732332615978813 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.656441717791411, "acc_stderr": 0.037311335196738925, "acc_norm": 0.656441717791411, "acc_norm_stderr": 0.037311335196738925 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.6601941747572816, "acc_stderr": 0.046897659372781335, "acc_norm": 0.6601941747572816, "acc_norm_stderr": 0.046897659372781335 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7991452991452992, "acc_stderr": 0.02624677294689048, "acc_norm": 0.7991452991452992, "acc_norm_stderr": 0.02624677294689048 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 
0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7113665389527458, "acc_stderr": 0.016203792703197776, "acc_norm": 0.7113665389527458, "acc_norm_stderr": 0.016203792703197776 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5520231213872833, "acc_stderr": 0.02677299065336182, "acc_norm": 0.5520231213872833, "acc_norm_stderr": 0.02677299065336182 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859924, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859924 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5359477124183006, "acc_stderr": 0.02855582751652878, "acc_norm": 0.5359477124183006, "acc_norm_stderr": 0.02855582751652878 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.027316847674192703, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.027316847674192703 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6265432098765432, "acc_stderr": 0.026915003011380157, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.026915003011380157 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3617021276595745, "acc_stderr": 0.028663820147199492, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.028663820147199492 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4106910039113429, "acc_stderr": 0.012564871542534353, "acc_norm": 0.4106910039113429, "acc_norm_stderr": 0.012564871542534353 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.0302114796091216, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.0302114796091216 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.553921568627451, "acc_stderr": 0.020109864547181357, "acc_norm": 0.553921568627451, "acc_norm_stderr": 0.020109864547181357 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5909090909090909, "acc_stderr": 
0.04709306978661896, "acc_norm": 0.5909090909090909, "acc_norm_stderr": 0.04709306978661896 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5346938775510204, "acc_stderr": 0.03193207024425314, "acc_norm": 0.5346938775510204, "acc_norm_stderr": 0.03193207024425314 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7064676616915423, "acc_stderr": 0.03220024104534204, "acc_norm": 0.7064676616915423, "acc_norm_stderr": 0.03220024104534204 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.26193390452876375, "mc1_stderr": 0.01539211880501503, "mc2": 0.37999611805412853, "mc2_stderr": 0.013428724763055466 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
65,047
[ [ -0.0489501953125, -0.05810546875, 0.019500732421875, 0.0157318115234375, -0.014129638671875, -0.0002129077911376953, 0.0031108856201171875, -0.017730712890625, 0.038818359375, -0.00420379638671875, -0.033905029296875, -0.0472412109375, -0.032379150390625, 0....
ChanHE/score_112_text
2023-10-09T07:11:11.000Z
[ "region:us" ]
ChanHE
null
null
0
0
2023-10-09T07:09:57
Entry not found
15
[ [ -0.02142333984375, -0.01495361328125, 0.05718994140625, 0.0288238525390625, -0.035064697265625, 0.046539306640625, 0.052520751953125, 0.005062103271484375, 0.0513916015625, 0.016998291015625, -0.052093505859375, -0.014984130859375, -0.060394287109375, 0.0379...
Falah/night_time_prompts
2023-10-09T07:18:42.000Z
[ "region:us" ]
Falah
null
null
0
0
2023-10-09T07:18:41
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 706 num_examples: 5 download_size: 1559 dataset_size: 706 --- # Dataset Card for "night_time_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
349
[ [ -0.031494140625, -0.0235137939453125, 0.0268707275390625, 0.0309906005859375, -0.0228424072265625, 0.0063018798828125, 0.01290130615234375, -0.0125732421875, 0.054595947265625, 0.03228759765625, -0.07061767578125, -0.04888916015625, -0.0189666748046875, -0.0...
Falah/kids_coloring_book_prompts
2023-10-09T08:09:10.000Z
[ "region:us" ]
Falah
null
null
0
0
2023-10-09T08:04:42
--- dataset_info: features: - name: prompts dtype: string splits: - name: train num_bytes: 287035 num_examples: 3000 download_size: 4140 dataset_size: 287035 --- # Dataset Card for "kids_coloring_book_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
366
[ [ -0.04327392578125, -0.00872039794921875, 0.004390716552734375, 0.0199737548828125, -0.0193023681640625, 0.001132965087890625, 0.018096923828125, -0.0017032623291015625, 0.0357666015625, 0.0178375244140625, -0.08843994140625, -0.047821044921875, -0.03396606445312...
debthedev/Benglish_LLM_dataset
2023-10-09T08:19:12.000Z
[ "region:us" ]
debthedev
null
null
0
0
2023-10-09T08:19:12
Entry not found
15
[ [ -0.021392822265625, -0.01494598388671875, 0.05718994140625, 0.028839111328125, -0.0350341796875, 0.046539306640625, 0.052490234375, 0.00507354736328125, 0.051361083984375, 0.01702880859375, -0.052093505859375, -0.01494598388671875, -0.06036376953125, 0.03790...
chiayewken/m3exam
2023-10-09T08:38:11.000Z
[ "arxiv:2306.05179", "region:us" ]
chiayewken
null
null
0
0
2023-10-09T08:32:21
--- dataset_info: - config_name: afrikaans features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 8860 num_examples: 25 - name: test num_bytes: 194333 num_examples: 258 download_size: 71295 dataset_size: 203193 - config_name: chinese features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 25055 num_examples: 29 - name: test num_bytes: 485093 num_examples: 682 download_size: 289255 dataset_size: 510148 - config_name: english features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 12792 num_examples: 32 - name: test num_bytes: 2573796 num_examples: 1911 download_size: 697219 dataset_size: 2586588 - config_name: italian features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 5834 num_examples: 18 - name: test num_bytes: 2397963 num_examples: 811 download_size: 326671 dataset_size: 2403797 - config_name: javanese features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - 
name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 1425 num_examples: 6 - name: test num_bytes: 187280 num_examples: 371 download_size: 84085 dataset_size: 188705 - config_name: portuguese features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 20979 num_examples: 24 - name: test num_bytes: 941655 num_examples: 889 download_size: 614816 dataset_size: 962634 - config_name: swahili features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 2053 num_examples: 6 - name: test num_bytes: 607215 num_examples: 428 download_size: 94031 dataset_size: 609268 - config_name: thai features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 16185 num_examples: 26 - name: test num_bytes: 2249737 num_examples: 2168 download_size: 901256 dataset_size: 2265922 - config_name: vietnamese features: - name: question_text dtype: string - name: background dtype: string - name: answer_text dtype: string - name: options sequence: string - name: language dtype: string - name: level dtype: string - name: subject dtype: string - name: subject_category dtype: string splits: - name: dev num_bytes: 7974 num_examples: 28 - name: test num_bytes: 767759 num_examples: 1789 download_size: 375774 
dataset_size: 775733 configs: - config_name: afrikaans data_files: - split: dev path: afrikaans/dev-* - split: test path: afrikaans/test-* - config_name: chinese data_files: - split: dev path: chinese/dev-* - split: test path: chinese/test-* - config_name: english data_files: - split: dev path: english/dev-* - split: test path: english/test-* - config_name: italian data_files: - split: dev path: italian/dev-* - split: test path: italian/test-* - config_name: javanese data_files: - split: dev path: javanese/dev-* - split: test path: javanese/test-* - config_name: portuguese data_files: - split: dev path: portuguese/dev-* - split: test path: portuguese/test-* - config_name: swahili data_files: - split: dev path: swahili/dev-* - split: test path: swahili/test-* - config_name: thai data_files: - split: dev path: thai/dev-* - split: test path: thai/test-* - config_name: vietnamese data_files: - split: dev path: vietnamese/dev-* - split: test path: vietnamese/test-* --- # M3Exam: A Multilingual 🌏, Multimodal 🖼, Multilevel 📈 Benchmark for LLMs This is the repository for [M3Exam: A Multilingual, Multimodal, Multilevel Benchmark for Examining Large Language Models](https://arxiv.org/abs/2306.05179/). TL;DR: We introduce M3Exam, a novel benchmark sourced from real and official human exam questions for evaluating LLMs in a multilingual, multimodal, and multilevel context. ![image](https://github.com/DAMO-NLP-SG/M3Exam/blob/main/images/m3exam-examples.jpg?raw=true)
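The per-language split sizes in the `dataset_info` metadata above can be tallied for a quick overview of the benchmark; a minimal plain-Python sketch (the counts are copied directly from the YAML, not recomputed from the data):

```python
# Per-language test-split sizes, copied from the dataset_info metadata above.
test_examples = {
    "afrikaans": 258, "chinese": 682, "english": 1911,
    "italian": 811, "javanese": 371, "portuguese": 889,
    "swahili": 428, "thai": 2168, "vietnamese": 1789,
}

# Total number of test questions across all nine language configs,
# and the language with the largest test split.
total = sum(test_examples.values())
largest = max(test_examples, key=test_examples.get)
print(total, largest)  # → 9307 thai
```

Note these are the test-split counts for this repository's configs only; the dev splits add a further handful of few-shot examples per language.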
6,281
[ [ -0.0394287109375, -0.052734375, 0.03326416015625, 0.020782470703125, -0.00522613525390625, 0.0073699951171875, -0.0007390975952148438, -0.02716064453125, -0.0114593505859375, 0.0211944580078125, -0.046173095703125, -0.058990478515625, -0.0377197265625, 0.009...