Muthuchancoach/TG_QnA
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 39973 num_examples: 158 download_size: 10001 dataset_size: 39973 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "TG_QnA" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
income/cqadupstack-unix-top-20-gen-queries
--- annotations_creators: [] language_creators: [] language: - en license: - cc-by-sa-4.0 multilinguality: - monolingual paperswithcode_id: beir pretty_name: BEIR Benchmark size_categories: msmarco: - 1M<n<10M trec-covid: - 100k<n<1M nfcorpus: - 1K<n<10K nq: - 1M<n<10M hotpotqa: - 1M<n<10M fiqa: - 10K<n<100K arguana: - 1K<n<10K touche-2020: - 100K<n<1M cqadupstack: - 100K<n<1M quora: - 100K<n<1M dbpedia: - 1M<n<10M scidocs: - 10K<n<100K fever: - 1M<n<10M climate-fever: - 1M<n<10M scifact: - 1K<n<10K source_datasets: [] task_categories: - text-retrieval --- # CQADupstack-Unix: 20 generated queries (BEIR Benchmark) This dataset contains the top-20 synthetic queries generated for each passage of the CQADupstack (Unix) corpus in the BEIR benchmark. - DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1) - id (str): unique document id in the BEIR benchmark (`corpus.jsonl`). - Queries generated per passage: 20 - Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py) The old dataset card for the BEIR benchmark follows below.
# Dataset Card for BEIR Benchmark ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://github.com/UKPLab/beir - **Repository:** https://github.com/UKPLab/beir - **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ - **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns - **Point of Contact:** nandan.thakur@uwaterloo.ca ### Dataset Summary BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks: - Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact) - Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/) - Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) - News 
Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html) - Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data) - Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) - Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs) - Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html) - Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)

All these datasets have been preprocessed and can be used for your experiments. For example, a dataset can be downloaded and loaded with the `beir` package (a minimal sketch; requires `pip install beir` and network access):

```python
from beir import util
from beir.datasets.data_loader import GenericDataLoader

dataset = "scifact"
url = f"https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/{dataset}.zip"
data_path = util.download_and_unzip(url, "datasets")
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
```

### Supported Tasks and Leaderboards

The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia. The current best-performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).

### Languages

All tasks are in English (`en`).

## Dataset Structure

All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields: `_id` with a unique document identifier, `title` with the document title (optional) and `text` with a document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields: `_id` with a unique query identifier and `text` with the query text.
For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`

### Data Instances

A high-level example of any BEIR dataset:

```python
corpus = {
    "doc1" : {
        "title": "Albert Einstein",
        "text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
    },
    "doc2" : {
        "title": "",  # Keep title an empty string if not present
        "text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made \
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
    },
}

queries = {
    "q1" : "Who developed the mass-energy equivalence formula?",
    "q2" : "Which beer is brewed with a large proportion of wheat?"
}

qrels = {
    "q1" : {"doc1": 1},
    "q2" : {"doc2": 1},
}
```

### Data Fields

Examples from all configurations have the following features:

### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.

### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.

### Data Splits

| Dataset | Website | BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | ----- | --------- | --------- | ----------- | --------- | --------- | :----------: | :------: |
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/) | ``msmarco`` | ``train``<br>``dev``<br>``test`` | 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html) | ``trec-covid`` | ``test`` | 50 | 171K | 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test`` | 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq`` | ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq`` | ``train``<br>``test`` | 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa`` | ``train``<br>``dev``<br>``test`` | 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test`` | 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html) | ``signal1m`` | ``test`` | 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test`` | 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana`` | ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020 | [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020`` | ``test`` | 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack | [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack`` | ``test`` | 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora | [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora`` | ``dev``<br>``test`` | 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity`` | ``dev``<br>``test`` | 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS | [Homepage](https://allenai.org/data/scidocs) | ``scidocs`` | ``test`` | 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever`` | ``train``<br>``dev``<br>``test`` | 6,666 | 5.42M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER | [Homepage](http://climatefever.ai) | ``climate-fever`` | ``test`` | 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact | [Homepage](https://github.com/allenai/scifact) | ``scifact`` | ``train``<br>``test`` | 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04`` | ``test`` | 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |

## Dataset Creation

### Curation Rationale [Needs More Information] ### Source Data #### Initial Data Collection and Normalization [Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process [Needs More Information] #### Who are the annotators?
[Needs More Information] ### Personal and Sensitive Information [Needs More Information] ## Considerations for Using the Data ### Social Impact of Dataset [Needs More Information] ### Discussion of Biases [Needs More Information] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators [Needs More Information] ### Licensing Information [Needs More Information] ### Citation Information Cite as:

```
@inproceedings{thakur2021beir,
  title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
  author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
  booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
  year={2021},
  url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```

### Contributions Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset.
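The corpus/queries/qrels layout described in the card above can be exercised end-to-end with a small standard-library sketch (file names follow the card; the toy contents are illustrative):

```python
import csv
import json
import os
import tempfile

tmp = tempfile.mkdtemp()

# Write a toy corpus.jsonl, queries.jsonl and qrels.tsv in the BEIR layout.
with open(os.path.join(tmp, "corpus.jsonl"), "w") as f:
    f.write(json.dumps({"_id": "doc1", "title": "Albert Einstein",
                        "text": "Albert Einstein was a German-born..."}) + "\n")
with open(os.path.join(tmp, "queries.jsonl"), "w") as f:
    f.write(json.dumps({"_id": "q1",
                        "text": "Who developed the mass-energy equivalence formula?"}) + "\n")
with open(os.path.join(tmp, "qrels.tsv"), "w") as f:
    f.write("query-id\tcorpus-id\tscore\n")  # first row is the header
    f.write("q1\tdoc1\t1\n")

# Load everything back into the in-memory dicts from the Data Instances example.
corpus, queries, qrels = {}, {}, {}
with open(os.path.join(tmp, "corpus.jsonl")) as f:
    for line in f:
        d = json.loads(line)
        corpus[d["_id"]] = {"title": d.get("title", ""), "text": d["text"]}
with open(os.path.join(tmp, "queries.jsonl")) as f:
    for line in f:
        d = json.loads(line)
        queries[d["_id"]] = d["text"]
with open(os.path.join(tmp, "qrels.tsv")) as f:
    for row in csv.DictReader(f, delimiter="\t"):
        qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])
```

The round trip reproduces the `corpus`, `queries` and `qrels` dicts shown in the Data Instances section.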
ThWu/mmlu_train_test_split
--- dataset_info: features: - name: input dtype: string - name: A dtype: string - name: B dtype: string - name: C dtype: string - name: D dtype: string - name: target dtype: string - name: topic dtype: string - name: question_id dtype: int64 splits: - name: train num_bytes: 6984551 num_examples: 14042 - name: test num_bytes: 765372 num_examples: 1531 download_size: 4269240 dataset_size: 7749923 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
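Rows with the features above (`input`, `A`–`D`, `target`) can be turned into a standard multiple-choice prompt. A minimal sketch (the example row is fabricated for illustration):

```python
# Fabricated row shaped like the features listed above.
row = {"input": "What is 2 + 2?", "A": "3", "B": "4", "C": "5", "D": "6",
       "target": "B", "topic": "elementary_mathematics", "question_id": 0}

def format_prompt(row):
    # Join the question stem with lettered choices, ending with an answer cue.
    choices = "\n".join(f"{letter}. {row[letter]}" for letter in "ABCD")
    return f"{row['input']}\n{choices}\nAnswer:"

prompt = format_prompt(row)
```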
Bingsu/some_corpus
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 11307919646 num_examples: 12932421 download_size: 1089008611 dataset_size: 11307919646 --- # Dataset Card for "some_corpus" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ai-aerospace/ams_data_train_generic_v0.1_100
--- license: apache-2.0 base-model: TheBloke/Llama-2-7B-Chat-GGUF --- Question-and-answer pairs for the first 100 of the Aerospace Mechanisms Symposia 5000-word chunk entries. The full file of entries is here: https://github.com/dsmueller3760/aerospace_chatbot/blob/llm_training/data/AMS/ams_data_answers.jsonl See this repository for details: https://github.com/dsmueller3760/aerospace_chatbot/tree/main Prompts were generated using TheBloke/Llama-2-7B-Chat-GGUF.
rescer/twitter_dataset_1713222816
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 202139 num_examples: 638 download_size: 115925 dataset_size: 202139 configs: - config_name: default data_files: - split: train path: data/train-* ---
delphi-suite/v0-next-logprobs-llama2-400k
--- dataset_info: features: - name: logprobs sequence: float64 splits: - name: validation num_bytes: 45818277 num_examples: 10982 download_size: 37682254 dataset_size: 45818277 configs: - config_name: default data_files: - split: validation path: data/validation-* ---
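Each row stores a sequence of per-token log-probabilities, from which an average negative log-likelihood and a perplexity can be derived. A sketch with fabricated values:

```python
import math

# Fabricated per-token log-probabilities, shaped like the `logprobs` feature.
logprobs = [-0.5, -1.0, -0.25]

avg_nll = -sum(logprobs) / len(logprobs)  # average negative log-likelihood
perplexity = math.exp(avg_nll)            # exp of the average NLL
```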
Disfluency/disfluency-es-16k
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 - name: transcription dtype: string splits: - name: train num_bytes: 21315396.0 num_examples: 270 - name: test num_bytes: 1731088.0 num_examples: 30 download_size: 21061943 dataset_size: 23046484.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* --- # Dataset Card for "disfluency-es-16k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vishnusr/code_searchnet_reduced_val
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: 'Unnamed: 0.1' dtype: int64 - name: 'Unnamed: 0' dtype: int64 - name: code dtype: string - name: docstring dtype: string - name: prompt dtype: string splits: - name: train num_bytes: 1078734 num_examples: 500 download_size: 483209 dataset_size: 1078734 --- # Dataset Card for "code_searchnet_reduced_val" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
GIZ/vulnerability_training_data_full
--- task_categories: - text-classification language: - en tags: - climate pretty_name: Vulnerability training data ---
delayedkarma/langchain-issues
--- license: apache-2.0 --- ### Dataset Card for LangChain Issues #### Dataset Summary LangChain Issues is a dataset of issues and pull requests from the LangChain repository (https://github.com/langchain-ai/langchain). It is intended for educational purposes and can be used for semantic search or multilabel text classification. The contents of each issue are in English and concern the LangChain framework for building applications with large language models.
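For the multilabel text-classification use case, issue labels can be encoded as multi-hot vectors. A minimal standard-library sketch (issue titles and labels are fabricated):

```python
# Fabricated issues shaped like GitHub issue records with label lists.
issues = [
    {"title": "Retriever returns stale documents", "labels": ["bug", "retrieval"]},
    {"title": "Add documentation for agents", "labels": ["documentation"]},
]

# Fix a label vocabulary, then encode each issue as a multi-hot vector.
label_space = sorted({label for issue in issues for label in issue["labels"]})

def to_multi_hot(labels):
    return [1 if label in labels else 0 for label in label_space]

y = [to_multi_hot(issue["labels"]) for issue in issues]
```

These vectors can then serve as targets for any multilabel classifier.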
12Manman12/translate-en-to-bclvrc
--- language: - en tags: - code ---
nielsr/datacomp_small_english_captions_with_weird_characters
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: uid dtype: string - name: url dtype: string - name: text dtype: string - name: original_width dtype: int64 - name: original_height dtype: int64 - name: clip_b32_similarity_score dtype: float32 - name: clip_l14_similarity_score dtype: float32 - name: face_bboxes sequence: sequence: float64 - name: sha256 dtype: string - name: detected_language dtype: string splits: - name: train num_bytes: 96674791.78677922 num_examples: 301183 download_size: 82472583 dataset_size: 96674791.78677922 --- # Dataset Card for "datacomp_small_english_captions_with_weird_characters" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
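The `clip_b32_similarity_score` and `clip_l14_similarity_score` columns allow filtering captions by image-text similarity. A sketch over fabricated rows (the threshold is arbitrary):

```python
# Fabricated rows shaped like the features above; scores are illustrative.
rows = [
    {"uid": "a", "text": "a café in Zürich", "clip_b32_similarity_score": 0.31},
    {"uid": "b", "text": "naïve caption", "clip_b32_similarity_score": 0.18},
]

THRESHOLD = 0.25  # arbitrary cut-off for this sketch
kept = [r["uid"] for r in rows if r["clip_b32_similarity_score"] >= THRESHOLD]
```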
open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2
--- pretty_name: Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [BarryFutureman/WestLakeX-7B-EvoMerge-Variant2](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge-Variant2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-02T03:18:15.694379](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2/blob/main/results_2024-02-02T03-18-15.694379.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538297853321007,\n\ \ \"acc_stderr\": 0.0320522890373237,\n \"acc_norm\": 0.6530566018124656,\n\ \ \"acc_norm_stderr\": 0.03272874681048371,\n \"mc1\": 0.5618115055079559,\n\ \ \"mc1_stderr\": 0.01736923616440441,\n \"mc2\": 0.7034639754228852,\n\ \ \"mc2_stderr\": 0.014889031021791599\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n\ \ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7144991037641903,\n\ \ \"acc_stderr\": 0.0045072961962278075,\n \"acc_norm\": 0.8851822346146186,\n\ \ \"acc_norm_stderr\": 0.003181503506054323\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\ \ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\ \ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\ \ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\ \ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\ \ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\ \ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\ acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\ \ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\ \ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\ acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\ acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\ \ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \ \ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \ \ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\ acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \ \ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7933884297520661,\n \"acc_stderr\": 
0.03695980128098824,\n \"\ acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\ \ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\ \ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\ \ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\ \ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\ \ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\ \ \"acc_stderr\": 0.01651959427529712,\n \"acc_norm\": 0.4223463687150838,\n\ \ \"acc_norm_stderr\": 0.01651959427529712\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\ \ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\ \ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\ \ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\ \ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \ \ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\ \ \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n\ \ \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\ \ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \ \ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\ \ 
\"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\ \ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\ \ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\ \ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n\ \ \"mc1_stderr\": 0.01736923616440441,\n \"mc2\": 0.7034639754228852,\n\ \ \"mc2_stderr\": 0.014889031021791599\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8579321231254933,\n \"acc_stderr\": 0.009812000391679369\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \ \ \"acc_stderr\": 0.012815868296721353\n }\n}\n```" repo_url: https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|arc:challenge|25_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-02T03-18-15.694379.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|gsm8k|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_02T03_18_15.694379 path: - 
'**/details_harness|hellaswag|10_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-18-15.694379.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T03-18-15.694379.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_02T03_18_15.694379 path: - '**/details_harness|winogrande|5_2024-02-02T03-18-15.694379.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-02T03-18-15.694379.parquet' - config_name: results data_files: - split: 
2024_02_02T03_18_15.694379 path: - results_2024-02-02T03-18-15.694379.parquet - split: latest path: - results_2024-02-02T03-18-15.694379.parquet --- # Dataset Card for Evaluation run of BarryFutureman/WestLakeX-7B-EvoMerge-Variant2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BarryFutureman/WestLakeX-7B-EvoMerge-Variant2](https://huggingface.co/BarryFutureman/WestLakeX-7B-EvoMerge-Variant2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-02T03:18:15.694379](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2/blob/main/results_2024-02-02T03-18-15.694379.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6538297853321007, "acc_stderr": 0.0320522890373237, "acc_norm": 0.6530566018124656, "acc_norm_stderr": 0.03272874681048371, "mc1": 0.5618115055079559, "mc1_stderr": 0.01736923616440441, "mc2": 0.7034639754228852, "mc2_stderr": 0.014889031021791599 }, "harness|arc:challenge|25": { "acc": 0.7081911262798635, "acc_stderr": 0.013284525292403511, "acc_norm": 0.7252559726962458, "acc_norm_stderr": 0.013044617212771227 }, "harness|hellaswag|10": { "acc": 0.7144991037641903, "acc_stderr": 0.0045072961962278075, "acc_norm": 0.8851822346146186, "acc_norm_stderr": 0.003181503506054323 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720386, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720386 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.0373852067611967, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.0373852067611967 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531003, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531003 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.02385479568097112, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.02385479568097112 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.02925290592725197, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.02925290592725197 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886786, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 
0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4223463687150838, "acc_stderr": 0.01651959427529712, "acc_norm": 0.4223463687150838, "acc_norm_stderr": 0.01651959427529712 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7058823529411765, "acc_stderr": 0.02609016250427905, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.02609016250427905 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.02447722285613511, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.02447722285613511 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.012741974333897227, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.012741974333897227 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.018950886770806315, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.018950886770806315 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 
0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.02740385941078685, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.02740385941078685 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5618115055079559, "mc1_stderr": 0.01736923616440441, "mc2": 0.7034639754228852, "mc2_stderr": 0.014889031021791599 }, "harness|winogrande|5": { "acc": 0.8579321231254933, "acc_stderr": 0.009812000391679369 }, "harness|gsm8k|5": { "acc": 0.6830932524639879, "acc_stderr": 0.012815868296721353 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
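The config names listed in the card above follow a regular pattern (e.g. `harness_hendrycksTest_marketing_5` for the 5-shot MMLU marketing subset), so per-task details can be addressed programmatically rather than by hand. A minimal sketch of that naming scheme; the `build_config_name` helper is illustrative, not part of any library API, and the commented-out fetch assumes the `datasets` library is installed with network access:

```python
def build_config_name(task: str, n_shot: int) -> str:
    """Build a leaderboard details config name for an MMLU subject,
    e.g. build_config_name("marketing", 5) -> "harness_hendrycksTest_marketing_5"."""
    return f"harness_hendrycksTest_{task}_{n_shot}"


# Network-dependent part, shown commented out (requires `pip install datasets`):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_BarryFutureman__WestLakeX-7B-EvoMerge-Variant2",
#     build_config_name("marketing", 5),
#     split="latest",  # "latest" always points at the most recent run
# )

print(build_config_name("marketing", 5))  # harness_hendrycksTest_marketing_5
```

The same pattern applies to the non-MMLU configs (`harness_arc_challenge_25`, `harness_winogrande_5`, ...), which only differ in their prefix and shot count.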
aanchalsatyan/Embedding
--- license: mit ---
Shakebird/white_house_speech
--- license: mit language: - en ---
CarlosFersoft/GPBusiness
--- license: mit task_categories: - question-answering language: - es tags: - finance pretty_name: GPBusiness size_categories: - n<1K ---
skaty5678/temp-collator-7711
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 16184544 num_examples: 7711 download_size: 4456541 dataset_size: 16184544 configs: - config_name: default data_files: - split: train path: data/train-* ---
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d7f07bf2
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 182 num_examples: 10 download_size: 1330 dataset_size: 182 --- # Dataset Card for "d7f07bf2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AkashMnd/prismadgen
--- license: mit ---
open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B
--- pretty_name: Evaluation run of garage-bAInd/Stable-Platypus2-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [garage-bAInd/Stable-Platypus2-13B](https://huggingface.co/garage-bAInd/Stable-Platypus2-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-17T23:47:31.962394](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B/blob/main/results_2023-09-17T23-47-31.962394.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37531459731543626,\n\ \ \"em_stderr\": 0.004958702554959804,\n \"f1\": 0.45221476510067204,\n\ \ \"f1_stderr\": 0.004729347386559949,\n \"acc\": 0.39347033490847444,\n\ \ \"acc_stderr\": 0.00776582600946219\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.37531459731543626,\n \"em_stderr\": 0.004958702554959804,\n\ \ \"f1\": 0.45221476510067204,\n \"f1_stderr\": 0.004729347386559949\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \ \ \"acc_stderr\": 0.003681611894073872\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\ \ }\n}\n```" repo_url: https://huggingface.co/garage-bAInd/Stable-Platypus2-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|arc:challenge|25_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-09T15:52:34.927040.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_17T23_47_31.962394 path: - '**/details_harness|drop|3_2023-09-17T23-47-31.962394.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-17T23-47-31.962394.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_17T23_47_31.962394 path: - '**/details_harness|gsm8k|5_2023-09-17T23-47-31.962394.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-17T23-47-31.962394.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hellaswag|10_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_09T15_52_34.927040 path: - '**/details_harness|truthfulqa:mc|0_2023-08-09T15:52:34.927040.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-09T15:52:34.927040.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_17T23_47_31.962394 path: - '**/details_harness|winogrande|5_2023-09-17T23-47-31.962394.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-17T23-47-31.962394.parquet' - config_name: results data_files: - split: 2023_08_09T15_52_34.927040 path: - results_2023-08-09T15:52:34.927040.parquet - split: 2023_09_17T23_47_31.962394 path: - results_2023-09-17T23-47-31.962394.parquet - split: latest path: - results_2023-09-17T23-47-31.962394.parquet --- # Dataset Card for Evaluation run of garage-bAInd/Stable-Platypus2-13B ## Dataset Description - 
**Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Stable-Platypus2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [garage-bAInd/Stable-Platypus2-13B](https://huggingface.co/garage-bAInd/Stable-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T23:47:31.962394](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B/blob/main/results_2023-09-17T23-47-31.962394.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.37531459731543626,
        "em_stderr": 0.004958702554959804,
        "f1": 0.45221476510067204,
        "f1_stderr": 0.004729347386559949,
        "acc": 0.39347033490847444,
        "acc_stderr": 0.00776582600946219
    },
    "harness|drop|3": {
        "em": 0.37531459731543626,
        "em_stderr": 0.004958702554959804,
        "f1": 0.45221476510067204,
        "f1_stderr": 0.004729347386559949
    },
    "harness|gsm8k|5": {
        "acc": 0.01819560272934041,
        "acc_stderr": 0.003681611894073872
    },
    "harness|winogrande|5": {
        "acc": 0.7687450670876085,
        "acc_stderr": 0.011850040124850508
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
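As a quick illustration of working with these aggregated numbers, the sketch below recomputes the headline `acc` of the `"all"` block as the plain mean of the per-task accuracy scores. The metric values are copied verbatim from the latest-results JSON in this card; treating `all.acc` as an unweighted mean of the accuracy-style tasks is an assumption (it happens to reproduce the reported value here).

```python
import math

# Per-task accuracy scores, copied verbatim from the latest-results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.01819560272934041,
    "harness|winogrande|5": 0.7687450670876085,
}

# The "all" block reports acc = 0.39347033490847444; recompute it as an
# unweighted mean over the tasks that report an "acc" metric.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert math.isclose(mean_acc, 0.39347033490847444)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.6f}")
```

The `em`/`f1` pair in the `"all"` block carries over from `harness|drop|3` unchanged, since DROP is the only task here that reports those metrics.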
open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama
--- pretty_name: Evaluation run of chargoddard/internlm2-base-20b-llama dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chargoddard/internlm2-base-20b-llama](https://huggingface.co/chargoddard/internlm2-base-20b-llama)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-21T08:12:11.575065](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama/blob/main/results_2024-01-21T08-12-11.575065.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6376672606100185,\n\ \ \"acc_stderr\": 0.03233501598179968,\n \"acc_norm\": 0.6426334526719998,\n\ \ \"acc_norm_stderr\": 0.0329801483756123,\n \"mc1\": 0.2913096695226438,\n\ \ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.43966281100559496,\n\ \ \"mc2_stderr\": 0.014256122898440773\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\ \ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6158135829516033,\n\ \ \"acc_stderr\": 0.004854082479916909,\n \"acc_norm\": 0.8210515833499303,\n\ \ \"acc_norm_stderr\": 0.0038252574352092344\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\ \ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\ \ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\ \ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \ \ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\ \ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\ \ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\ \ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\ \ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\ \ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305528,\n \"\ acc_norm\": 0.4126984126984127,\n 
\"acc_norm_stderr\": 0.02535574126305528\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\ \ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\ \ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\ \ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\ \ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\ \ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\ : 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\ \ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110943,\n\ \ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110943\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n 
\"\ acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \ \ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977924,\n\ \ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977924\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\ acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\ acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\"\ : 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n\ \ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\ \ 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"\ acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\ \ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\ \ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\ \ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 
0.039418975265163025,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\ \ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\ \ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\ \ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\ \ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\ \ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\ \ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n\ \ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\ \ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n\ \ \"acc_stderr\": 0.016312376629213067,\n \"acc_norm\": 0.38994413407821227,\n\ \ \"acc_norm_stderr\": 0.016312376629213067\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\ \ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\ \ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419998,\n \ \ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419998\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\ \ \"acc_stderr\": 0.012756161942523365,\n \"acc_norm\": 0.4765319426336376,\n\ \ \"acc_norm_stderr\": 0.012756161942523365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n\ \ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\ \ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\ \ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291313,\n\ \ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291313\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\ \ 
\"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\ \ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \ \ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\ \ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.43966281100559496,\n\ \ \"mc2_stderr\": 0.014256122898440773\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44806671721000757,\n \ \ \"acc_stderr\": 0.013697992668274518\n }\n}\n```" repo_url: https://huggingface.co/chargoddard/internlm2-base-20b-llama leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|arc:challenge|25_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-21T08-12-11.575065.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|gsm8k|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hellaswag|10_2024-01-21T08-12-11.575065.parquet' - 
split: latest path: - '**/details_harness|hellaswag|10_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|truthfulqa:mc|0_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-21T08-12-11.575065.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_21T08_12_11.575065 path: - '**/details_harness|winogrande|5_2024-01-21T08-12-11.575065.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-21T08-12-11.575065.parquet' - config_name: results data_files: - split: 
2024_01_21T08_12_11.575065 path: - results_2024-01-21T08-12-11.575065.parquet - split: latest path: - results_2024-01-21T08-12-11.575065.parquet --- # Dataset Card for Evaluation run of chargoddard/internlm2-base-20b-llama <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [chargoddard/internlm2-base-20b-llama](https://huggingface.co/chargoddard/internlm2-base-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T08:12:11.575065](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama/blob/main/results_2024-01-21T08-12-11.575065.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6376672606100185, "acc_stderr": 0.03233501598179968, "acc_norm": 0.6426334526719998, "acc_norm_stderr": 0.0329801483756123, "mc1": 0.2913096695226438, "mc1_stderr": 0.01590598704818483, "mc2": 0.43966281100559496, "mc2_stderr": 0.014256122898440773 }, "harness|arc:challenge|25": { "acc": 0.5878839590443686, "acc_stderr": 0.014383915302225403, "acc_norm": 0.6305460750853242, "acc_norm_stderr": 0.014104578366491888 }, "harness|hellaswag|10": { "acc": 0.6158135829516033, "acc_stderr": 0.004854082479916909, "acc_norm": 0.8210515833499303, "acc_norm_stderr": 0.0038252574352092344 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.037150621549989056, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.037150621549989056 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544067, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544067 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, 
"acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416907, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036843, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036843 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.031967586978353627, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.031967586978353627 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305528, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305528 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.541871921182266, "acc_stderr": 0.03505630140785741, "acc_norm": 0.541871921182266, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721164, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721164 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026552207828215282, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026552207828215282 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015184, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015184 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6282051282051282, "acc_stderr": 0.024503472557110943, "acc_norm": 0.6282051282051282, "acc_norm_stderr": 0.024503472557110943 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977924, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977924 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092437, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092437 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 
0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.02786594228663933, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.02786594228663933 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.04039314978724561, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.04039314978724561 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.039418975265163025, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.033519538795212696, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128136, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128136 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, 
"acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7956577266922095, "acc_stderr": 0.0144191239809319, "acc_norm": 0.7956577266922095, "acc_norm_stderr": 0.0144191239809319 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6907514450867052, "acc_stderr": 0.02488314057007176, "acc_norm": 0.6907514450867052, "acc_norm_stderr": 0.02488314057007176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38994413407821227, "acc_stderr": 0.016312376629213067, "acc_norm": 0.38994413407821227, "acc_norm_stderr": 0.016312376629213067 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7091503267973857, "acc_stderr": 0.02600480036395213, "acc_norm": 0.7091503267973857, "acc_norm_stderr": 0.02600480036395213 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188947, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188947 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.029427994039419998, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.029427994039419998 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4765319426336376, "acc_stderr": 0.012756161942523365, "acc_norm": 0.4765319426336376, "acc_norm_stderr": 0.012756161942523365 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389844, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389844 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0190709855896875, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0190709855896875 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 
0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291313, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291313 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.2913096695226438, "mc1_stderr": 0.01590598704818483, "mc2": 0.43966281100559496, "mc2_stderr": 0.014256122898440773 }, "harness|winogrande|5": { "acc": 0.7821625887924231, "acc_stderr": 0.011601066079939324 }, "harness|gsm8k|5": { "acc": 0.44806671721000757, "acc_stderr": 0.013697992668274518 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
dkk2564996/sevkkkkkk
--- license: openrail ---
fiveflow/cot_ranking
--- dataset_info: features: - name: question dtype: string - name: response_j dtype: string - name: response_k dtype: string splits: - name: train num_bytes: 64266082 num_examples: 67830 - name: test num_bytes: 3323500 num_examples: 3570 download_size: 408618 dataset_size: 67589582 --- # Dataset Card for "cot_ranking" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
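The `response_j` / `response_k` column pair is a common convention for pairwise preference data, with `response_j` usually holding the preferred answer (the card does not state this, so treat it as an assumption). A minimal sketch of flattening one such record into a prompt/chosen/rejected triple for reward-model training; the field names come from the schema above, while the helper name and sample record are hypothetical:

```python
# Sketch: turn one (question, response_j, response_k) record into a
# chosen/rejected pair. The column names come from the card's schema;
# the assumption that `response_j` is the preferred answer is a common
# convention for such datasets, not something the card confirms.
def to_preference_pair(example: dict) -> dict:
    return {
        "prompt": example["question"],
        "chosen": example["response_j"],
        "rejected": example["response_k"],
    }

sample = {
    "question": "What is 2 + 2?",
    "response_j": "Let's reason step by step: 2 + 2 = 4.",
    "response_k": "5",
}
pair = to_preference_pair(sample)
print(pair["chosen"])  # Let's reason step by step: 2 + 2 = 4.
```

If the convention turns out to be reversed for this dataset, swapping the `chosen`/`rejected` assignments is the only change needed.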
jpbello/common_language_preprocessed
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: client_id dtype: string - name: path dtype: string - name: sentence dtype: string - name: age dtype: string - name: gender dtype: string - name: label dtype: class_label: names: '0': Arabic '1': Basque '2': Breton '3': Catalan '4': Chinese_China '5': Chinese_Hongkong '6': Chinese_Taiwan '7': Chuvash '8': Czech '9': Dhivehi '10': Dutch '11': English '12': Esperanto '13': Estonian '14': French '15': Frisian '16': Georgian '17': German '18': Greek '19': Hakha_Chin '20': Indonesian '21': Interlingua '22': Italian '23': Japanese '24': Kabyle '25': Kinyarwanda '26': Kyrgyz '27': Latvian '28': Maltese '29': Mangolian '30': Persian '31': Polish '32': Portuguese '33': Romanian '34': Romansh_Sursilvan '35': Russian '36': Sakha '37': Slovenian '38': Spanish '39': Swedish '40': Tamil '41': Tatar '42': Turkish '43': Ukranian '44': Welsh - name: input_values sequence: float32 - name: attention_mask sequence: int32 splits: - name: train num_bytes: 13848986619 num_examples: 22194 - name: validation num_bytes: 3461442109 num_examples: 5888 - name: test num_bytes: 3473659131 num_examples: 5963 download_size: 8143061729 dataset_size: 20784087859 --- # Dataset Card for "common_language_preprocessed" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
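The `label` column is a `ClassLabel` over the 45 languages listed in the schema above, stored as integers. A minimal sketch of decoding those integers without loading the dataset; the name list is copied verbatim from the card's schema (including its spellings "Mangolian" and "Ukranian"), and the helper name is hypothetical:

```python
# Sketch: map the integer `label` values back to language names.
# The list is copied verbatim from the ClassLabel names in the card's
# schema, in index order.
LANGUAGE_NAMES = [
    "Arabic", "Basque", "Breton", "Catalan", "Chinese_China",
    "Chinese_Hongkong", "Chinese_Taiwan", "Chuvash", "Czech", "Dhivehi",
    "Dutch", "English", "Esperanto", "Estonian", "French",
    "Frisian", "Georgian", "German", "Greek", "Hakha_Chin",
    "Indonesian", "Interlingua", "Italian", "Japanese", "Kabyle",
    "Kinyarwanda", "Kyrgyz", "Latvian", "Maltese", "Mangolian",
    "Persian", "Polish", "Portuguese", "Romanian", "Romansh_Sursilvan",
    "Russian", "Sakha", "Slovenian", "Spanish", "Swedish",
    "Tamil", "Tatar", "Turkish", "Ukranian", "Welsh",
]

def decode_label(label_id: int) -> str:
    return LANGUAGE_NAMES[label_id]

print(decode_label(11))  # English
```

When the dataset is actually loaded with `datasets`, the same mapping is available via the feature's own `int2str` method, so this table is only needed for offline work.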
c01dsnap/top-1m
--- license: cc-by-nc-sa-4.0 --- # Top 1 Million Domains A collection of free and directly downloadable top-1M domain lists, offering suitable alternatives to the now-retired Alexa.com service for obtaining top-domain ranking data. Please feel free to send a pull request for any other free and downloadable top-1M domain list. ## Lists ### Alexa - Download page: https://alexa.com/ (service ended in mid-2022) - Download link: https://s3.amazonaws.com/alexa-static/top-1m.csv.zip ### Cisco Umbrella - Download page: https://s3-us-west-1.amazonaws.com/umbrella-static/index.html - Download link: https://s3-us-west-1.amazonaws.com/umbrella-static/top-1m.csv.zip ### Majestic - Download page: https://majestic.com/reports/majestic-million - Download link: https://downloads.majestic.com/majestic_million.csv ### BuiltWith - Download page: https://builtwith.com/top-1m - Download link: https://builtwith.com/dl/builtwith-top1m.zip ### Statvoo - Download page: https://statvoo.com/top/ranked - Download link: https://statvoo.com/dl/top-1million-sites.csv.zip ### DomCop - Download page: https://www.domcop.com/top-10-million-websites - Download link: https://www.domcop.com/files/top/top10milliondomains.csv.zip ### Tranco - Download page: https://tranco-list.eu/ - Download link: https://tranco-list.eu/top-1m.csv.zip ### Cloudflare - Download page: https://radar.cloudflare.com/domains - Download link: https://radar.cloudflare.com/charts/LargerTopDomainsTable/attachment?id=525&top=1000000
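Several of these lists (e.g. Cisco Umbrella and Tranco) ship as a two-column `rank,domain` CSV, while others (e.g. Majestic and DomCop) use a wider layout with a header row, so a parser should check the first line. A minimal sketch of parsing the common two-column format, shown here on inline sample rows rather than a live download:

```python
import csv
import io

# Sketch: parse the "rank,domain" CSV layout used by several of the
# lists above. Header rows and malformed lines are skipped by checking
# that the first column is numeric. The sample rows are illustrative;
# a real run would read the extracted top-1m.csv instead.
def parse_top_domains(text: str) -> dict:
    ranks = {}
    for row in csv.reader(io.StringIO(text)):
        if len(row) != 2 or not row[0].isdigit():
            continue  # skip headers or lines that don't match rank,domain
        rank, domain = int(row[0]), row[1].strip().lower()
        ranks[domain] = rank
    return ranks

sample = "1,google.com\n2,facebook.com\n3,youtube.com\n"
ranks = parse_top_domains(sample)
print(ranks["facebook.com"])  # 2
```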
lucianosb/cetacean-ptbr
--- dataset_info: features: - name: instruction dtype: string - name: response dtype: string splits: - name: train num_bytes: 2314477136 num_examples: 1409162 download_size: 1413883118 dataset_size: 2314477136 configs: - config_name: default data_files: - split: train path: data/train-* license: mit language: - pt --- This dataset is a merge of [Open-Orca](https://huggingface.co/datasets/cnmoro/GPT4-500k-Augmented-PTBR-Clean) and [Dolphin](https://huggingface.co/datasets/JJhooww/dolphin_ptbr_alpaca_format), both translated to Portuguese.
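Since one of the source datasets is in Alpaca format, a natural way to consume the `instruction`/`response` pairs is to render them with an Alpaca-style prompt template. A minimal sketch; the column names come from the schema above, but the Portuguese template wording is a hypothetical adaptation, not something the card prescribes:

```python
# Sketch: render one instruction/response pair with an Alpaca-style
# template. The Portuguese template text is a hypothetical adaptation
# of the original Alpaca prompt, shown only for illustration.
TEMPLATE = (
    "Abaixo está uma instrução que descreve uma tarefa. "
    "Escreva uma resposta que complete adequadamente o pedido.\n\n"
    "### Instrução:\n{instruction}\n\n### Resposta:\n{response}"
)

def format_example(example: dict) -> str:
    return TEMPLATE.format(
        instruction=example["instruction"], response=example["response"]
    )

text = format_example(
    {"instruction": "Traduza 'hello' para o português.", "response": "Olá."}
)
print(text.endswith("### Resposta:\nOlá."))  # True
```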
open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chat
--- pretty_name: Evaluation run of giraffe176/WestLake_Noromaid_OpenHermes_neural-chat dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [giraffe176/WestLake_Noromaid_OpenHermes_neural-chat](https://huggingface.co/giraffe176/WestLake_Noromaid_OpenHermes_neural-chat)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chat\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-02T16:42:40.853852](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chat/blob/main/results_2024-03-02T16-42-40.853852.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502553416511081,\n\ \ \"acc_stderr\": 0.03213553415502459,\n \"acc_norm\": 0.6513459760802845,\n\ \ \"acc_norm_stderr\": 0.03278721251513175,\n \"mc1\": 0.390452876376989,\n\ \ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5546796539463894,\n\ \ \"mc2_stderr\": 0.015237857217982279\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038076,\n\ \ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518827\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6741684923322048,\n\ \ \"acc_stderr\": 0.0046772682828393995,\n \"acc_norm\": 0.8612826130252937,\n\ \ \"acc_norm_stderr\": 0.003449449618650543\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\ \ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\ \ \"acc_norm_stderr\": 0.03551446610810826\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \ \ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\ \ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\ \ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\ \ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\ \ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\ acc_norm\": 0.41534391534391535,\n 
\"acc_norm_stderr\": 0.025379524910778408\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\ \ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\ \ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\ \ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\ \ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\ \ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\ \ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\ \ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n\ \ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \ \ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n\ \ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\ acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n\ \ \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n\ \ \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n\ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\ acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\ \ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\ \ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\ \ \"acc_stderr\": 0.01322392861674162,\n \"acc_norm\": 0.8365261813537676,\n\ \ \"acc_norm_stderr\": 0.01322392861674162\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\ \ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\ \ \"acc_stderr\": 0.01580100372914589,\n \"acc_norm\": 0.33631284916201115,\n\ \ \"acc_norm_stderr\": 
0.01580100372914589\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\ \ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \ \ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\ \ \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n\ \ \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\ \ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\ \ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.8656716417910447,\n\ \ \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n\ \ \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\ \ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\ \ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5546796539463894,\n\ \ \"mc2_stderr\": 0.015237857217982279\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218327\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6565579984836998,\n \ \ \"acc_stderr\": 0.01307993381180031\n }\n}\n```" repo_url: https://huggingface.co/giraffe176/WestLake_Noromaid_OpenHermes_neural-chat leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|arc:challenge|25_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-02T16-42-40.853852.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|gsm8k|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_02T16_42_40.853852 path: - 
'**/details_harness|hellaswag|10_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T16-42-40.853852.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-02T16-42-40.853852.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T16-42-40.853852.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T16-42-40.853852.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T16-42-40.853852.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-02T16-42-40.853852.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T16-42-40.853852.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-management|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T16-42-40.853852.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|truthfulqa:mc|0_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-02T16-42-40.853852.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_02T16_42_40.853852 path: - '**/details_harness|winogrande|5_2024-03-02T16-42-40.853852.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-02T16-42-40.853852.parquet' - config_name: results data_files: - split: 
2024_03_02T16_42_40.853852 path: - results_2024-03-02T16-42-40.853852.parquet - split: latest path: - results_2024-03-02T16-42-40.853852.parquet
---

# Dataset Card for Evaluation run of giraffe176/WestLake_Noromaid_OpenHermes_neural-chat

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [giraffe176/WestLake_Noromaid_OpenHermes_neural-chat](https://huggingface.co/giraffe176/WestLake_Noromaid_OpenHermes_neural-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following (note that the configurations define a timestamped split and a `latest` split, not a `train` split):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chat",
    "harness_winogrande_5",
    split="latest",
)
```

## Latest results

These are the [latest results from run 2024-03-02T16:42:40.853852](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__WestLake_Noromaid_OpenHermes_neural-chat/blob/main/results_2024-03-02T16-42-40.853852.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6502553416511081, "acc_stderr": 0.03213553415502459, "acc_norm": 0.6513459760802845, "acc_norm_stderr": 0.03278721251513175, "mc1": 0.390452876376989, "mc1_stderr": 0.017078230743431448, "mc2": 0.5546796539463894, "mc2_stderr": 0.015237857217982279 }, "harness|arc:challenge|25": { "acc": 0.6382252559726962, "acc_stderr": 0.014041957945038076, "acc_norm": 0.6757679180887372, "acc_norm_stderr": 0.013678810399518827 }, "harness|hellaswag|10": { "acc": 0.6741684923322048, "acc_stderr": 0.0046772682828393995, "acc_norm": 0.8612826130252937, "acc_norm_stderr": 0.003449449618650543 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.028254200344438665, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.028254200344438665 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778408, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778408 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919443, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971128, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971128 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.362962962962963, "acc_stderr": 0.02931820364520686, "acc_norm": 0.362962962962963, "acc_norm_stderr": 0.02931820364520686 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7016806722689075, "acc_stderr": 0.029719142876342856, "acc_norm": 0.7016806722689075, "acc_norm_stderr": 0.029719142876342856 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 
0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.028125972265654373, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.028125972265654373 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597528, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597528 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 
0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8365261813537676, "acc_stderr": 0.01322392861674162, "acc_norm": 0.8365261813537676, "acc_norm_stderr": 0.01322392861674162 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33631284916201115, "acc_stderr": 0.01580100372914589, "acc_norm": 0.33631284916201115, "acc_norm_stderr": 0.01580100372914589 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.02399350170904211, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.02399350170904211 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46153846153846156, "acc_stderr": 0.01273239828619044, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.01273239828619044 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6764705882352942, "acc_stderr": 0.018926082916083383, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.018926082916083383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 
0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.02783302387139968, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.02783302387139968 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8656716417910447, "acc_stderr": 0.02411267824090081, "acc_norm": 0.8656716417910447, "acc_norm_stderr": 0.02411267824090081 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.02796678585916089, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.02796678585916089 }, "harness|truthfulqa:mc|0": { "mc1": 0.390452876376989, "mc1_stderr": 0.017078230743431448, "mc2": 0.5546796539463894, "mc2_stderr": 0.015237857217982279 }, "harness|winogrande|5": { "acc": 0.8042620363062352, "acc_stderr": 0.011151145042218327 }, "harness|gsm8k|5": { "acc": 0.6565579984836998, "acc_stderr": 0.01307993381180031 } }
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used.
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
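The per-task entries in the "Latest results" JSON above all share the key pattern `harness|<task>|<n_shot>`, which makes the file easy to post-process. As an illustrative sketch (the helper name and the small value subset below are ours, not part of the generated card), one metric can be collected per task like so:

```python
# Illustrative helper: collect one metric per task from a results dictionary
# shaped like the "Latest results" JSON above. The values here are a small
# subset copied from that block, for demonstration only.
results = {
    "all": {"acc": 0.6502553416511081, "acc_norm": 0.6513459760802845},
    "harness|arc:challenge|25": {"acc": 0.6382252559726962, "acc_norm": 0.6757679180887372},
    "harness|hellaswag|10": {"acc": 0.6741684923322048, "acc_norm": 0.8612826130252937},
    "harness|winogrande|5": {"acc": 0.8042620363062352},
}

def metric_by_task(results: dict, metric: str = "acc") -> dict:
    """Return {task: value} for every harness entry that reports `metric`."""
    return {
        key.split("|")[1]: scores[metric]
        for key, scores in results.items()
        if key.startswith("harness|") and metric in scores
    }

# winogrande reports no acc_norm, so it is skipped for that metric;
# the aggregate "all" entry is skipped because it has no "harness|" prefix.
print(metric_by_task(results, "acc_norm"))
```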
huolongguo10/MultiChat
---
license: openrail
task_categories:
- conversational
language:
- zh
tags:
- code
---
liuyanchen1015/MULTI_VALUE_mrpc_null_relcl
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: label
    dtype: int64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: test
    num_bytes: 83150
    num_examples: 293
  - name: train
    num_bytes: 189202
    num_examples: 660
  - name: validation
    num_bytes: 21484
    num_examples: 75
  download_size: 203711
  dataset_size: 293836
---

# Dataset Card for "MULTI_VALUE_mrpc_null_relcl"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
datahrvoje/twitter_dataset_1713006552
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 20501
    num_examples: 45
  download_size: 11125
  dataset_size: 20501
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
EgilKarlsen/Thunderbird_GPT2_Baseline
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: '0' dtype: float32 - name: '1' dtype: float32 - name: '2' dtype: float32 - name: '3' dtype: float32 - name: '4' dtype: float32 - name: '5' dtype: float32 - name: '6' dtype: float32 - name: '7' dtype: float32 - name: '8' dtype: float32 - name: '9' dtype: float32 - name: '10' dtype: float32 - name: '11' dtype: float32 - name: '12' dtype: float32 - name: '13' dtype: float32 - name: '14' dtype: float32 - name: '15' dtype: float32 - name: '16' dtype: float32 - name: '17' dtype: float32 - name: '18' dtype: float32 - name: '19' dtype: float32 - name: '20' dtype: float32 - name: '21' dtype: float32 - name: '22' dtype: float32 - name: '23' dtype: float32 - name: '24' dtype: float32 - name: '25' dtype: float32 - name: '26' dtype: float32 - name: '27' dtype: float32 - name: '28' dtype: float32 - name: '29' dtype: float32 - name: '30' dtype: float32 - name: '31' dtype: float32 - name: '32' dtype: float32 - name: '33' dtype: float32 - name: '34' dtype: float32 - name: '35' dtype: float32 - name: '36' dtype: float32 - name: '37' dtype: float32 - name: '38' dtype: float32 - name: '39' dtype: float32 - name: '40' dtype: float32 - name: '41' dtype: float32 - name: '42' dtype: float32 - name: '43' dtype: float32 - name: '44' dtype: float32 - name: '45' dtype: float32 - name: '46' dtype: float32 - name: '47' dtype: float32 - name: '48' dtype: float32 - name: '49' dtype: float32 - name: '50' dtype: float32 - name: '51' dtype: float32 - name: '52' dtype: float32 - name: '53' dtype: float32 - name: '54' dtype: float32 - name: '55' dtype: float32 - name: '56' dtype: float32 - name: '57' dtype: float32 - name: '58' dtype: float32 - name: '59' dtype: float32 - name: '60' dtype: float32 - name: '61' dtype: float32 - name: '62' dtype: float32 - name: '63' dtype: float32 - name: '64' dtype: float32 - name: '65' dtype: float32 - name: '66' dtype: 
float32 - name: '67' dtype: float32 - name: '68' dtype: float32 - name: '69' dtype: float32 - name: '70' dtype: float32 - name: '71' dtype: float32 - name: '72' dtype: float32 - name: '73' dtype: float32 - name: '74' dtype: float32 - name: '75' dtype: float32 - name: '76' dtype: float32 - name: '77' dtype: float32 - name: '78' dtype: float32 - name: '79' dtype: float32 - name: '80' dtype: float32 - name: '81' dtype: float32 - name: '82' dtype: float32 - name: '83' dtype: float32 - name: '84' dtype: float32 - name: '85' dtype: float32 - name: '86' dtype: float32 - name: '87' dtype: float32 - name: '88' dtype: float32 - name: '89' dtype: float32 - name: '90' dtype: float32 - name: '91' dtype: float32 - name: '92' dtype: float32 - name: '93' dtype: float32 - name: '94' dtype: float32 - name: '95' dtype: float32 - name: '96' dtype: float32 - name: '97' dtype: float32 - name: '98' dtype: float32 - name: '99' dtype: float32 - name: '100' dtype: float32 - name: '101' dtype: float32 - name: '102' dtype: float32 - name: '103' dtype: float32 - name: '104' dtype: float32 - name: '105' dtype: float32 - name: '106' dtype: float32 - name: '107' dtype: float32 - name: '108' dtype: float32 - name: '109' dtype: float32 - name: '110' dtype: float32 - name: '111' dtype: float32 - name: '112' dtype: float32 - name: '113' dtype: float32 - name: '114' dtype: float32 - name: '115' dtype: float32 - name: '116' dtype: float32 - name: '117' dtype: float32 - name: '118' dtype: float32 - name: '119' dtype: float32 - name: '120' dtype: float32 - name: '121' dtype: float32 - name: '122' dtype: float32 - name: '123' dtype: float32 - name: '124' dtype: float32 - name: '125' dtype: float32 - name: '126' dtype: float32 - name: '127' dtype: float32 - name: '128' dtype: float32 - name: '129' dtype: float32 - name: '130' dtype: float32 - name: '131' dtype: float32 - name: '132' dtype: float32 - name: '133' dtype: float32 - name: '134' dtype: float32 - name: '135' dtype: float32 - name: '136' dtype: 
float32 - name: '137' dtype: float32 - name: '138' dtype: float32 - name: '139' dtype: float32 - name: '140' dtype: float32 - name: '141' dtype: float32 - name: '142' dtype: float32 - name: '143' dtype: float32 - name: '144' dtype: float32 - name: '145' dtype: float32 - name: '146' dtype: float32 - name: '147' dtype: float32 - name: '148' dtype: float32 - name: '149' dtype: float32 - name: '150' dtype: float32 - name: '151' dtype: float32 - name: '152' dtype: float32 - name: '153' dtype: float32 - name: '154' dtype: float32 - name: '155' dtype: float32 - name: '156' dtype: float32 - name: '157' dtype: float32 - name: '158' dtype: float32 - name: '159' dtype: float32 - name: '160' dtype: float32 - name: '161' dtype: float32 - name: '162' dtype: float32 - name: '163' dtype: float32 - name: '164' dtype: float32 - name: '165' dtype: float32 - name: '166' dtype: float32 - name: '167' dtype: float32 - name: '168' dtype: float32 - name: '169' dtype: float32 - name: '170' dtype: float32 - name: '171' dtype: float32 - name: '172' dtype: float32 - name: '173' dtype: float32 - name: '174' dtype: float32 - name: '175' dtype: float32 - name: '176' dtype: float32 - name: '177' dtype: float32 - name: '178' dtype: float32 - name: '179' dtype: float32 - name: '180' dtype: float32 - name: '181' dtype: float32 - name: '182' dtype: float32 - name: '183' dtype: float32 - name: '184' dtype: float32 - name: '185' dtype: float32 - name: '186' dtype: float32 - name: '187' dtype: float32 - name: '188' dtype: float32 - name: '189' dtype: float32 - name: '190' dtype: float32 - name: '191' dtype: float32 - name: '192' dtype: float32 - name: '193' dtype: float32 - name: '194' dtype: float32 - name: '195' dtype: float32 - name: '196' dtype: float32 - name: '197' dtype: float32 - name: '198' dtype: float32 - name: '199' dtype: float32 - name: '200' dtype: float32 - name: '201' dtype: float32 - name: '202' dtype: float32 - name: '203' dtype: float32 - name: '204' dtype: float32 - name: '205' 
dtype: float32 - name: '206' dtype: float32 - name: '207' dtype: float32 - name: '208' dtype: float32 - name: '209' dtype: float32 - name: '210' dtype: float32 - name: '211' dtype: float32 - name: '212' dtype: float32 - name: '213' dtype: float32 - name: '214' dtype: float32 - name: '215' dtype: float32 - name: '216' dtype: float32 - name: '217' dtype: float32 - name: '218' dtype: float32 - name: '219' dtype: float32 - name: '220' dtype: float32 - name: '221' dtype: float32 - name: '222' dtype: float32 - name: '223' dtype: float32 - name: '224' dtype: float32 - name: '225' dtype: float32 - name: '226' dtype: float32 - name: '227' dtype: float32 - name: '228' dtype: float32 - name: '229' dtype: float32 - name: '230' dtype: float32 - name: '231' dtype: float32 - name: '232' dtype: float32 - name: '233' dtype: float32 - name: '234' dtype: float32 - name: '235' dtype: float32 - name: '236' dtype: float32 - name: '237' dtype: float32 - name: '238' dtype: float32 - name: '239' dtype: float32 - name: '240' dtype: float32 - name: '241' dtype: float32 - name: '242' dtype: float32 - name: '243' dtype: float32 - name: '244' dtype: float32 - name: '245' dtype: float32 - name: '246' dtype: float32 - name: '247' dtype: float32 - name: '248' dtype: float32 - name: '249' dtype: float32 - name: '250' dtype: float32 - name: '251' dtype: float32 - name: '252' dtype: float32 - name: '253' dtype: float32 - name: '254' dtype: float32 - name: '255' dtype: float32 - name: '256' dtype: float32 - name: '257' dtype: float32 - name: '258' dtype: float32 - name: '259' dtype: float32 - name: '260' dtype: float32 - name: '261' dtype: float32 - name: '262' dtype: float32 - name: '263' dtype: float32 - name: '264' dtype: float32 - name: '265' dtype: float32 - name: '266' dtype: float32 - name: '267' dtype: float32 - name: '268' dtype: float32 - name: '269' dtype: float32 - name: '270' dtype: float32 - name: '271' dtype: float32 - name: '272' dtype: float32 - name: '273' dtype: float32 - name: 
'274' dtype: float32 - name: '275' dtype: float32 - name: '276' dtype: float32 - name: '277' dtype: float32 - name: '278' dtype: float32 - name: '279' dtype: float32 - name: '280' dtype: float32 - name: '281' dtype: float32 - name: '282' dtype: float32 - name: '283' dtype: float32 - name: '284' dtype: float32 - name: '285' dtype: float32 - name: '286' dtype: float32 - name: '287' dtype: float32 - name: '288' dtype: float32 - name: '289' dtype: float32 - name: '290' dtype: float32 - name: '291' dtype: float32 - name: '292' dtype: float32 - name: '293' dtype: float32 - name: '294' dtype: float32 - name: '295' dtype: float32 - name: '296' dtype: float32 - name: '297' dtype: float32 - name: '298' dtype: float32 - name: '299' dtype: float32 - name: '300' dtype: float32 - name: '301' dtype: float32 - name: '302' dtype: float32 - name: '303' dtype: float32 - name: '304' dtype: float32 - name: '305' dtype: float32 - name: '306' dtype: float32 - name: '307' dtype: float32 - name: '308' dtype: float32 - name: '309' dtype: float32 - name: '310' dtype: float32 - name: '311' dtype: float32 - name: '312' dtype: float32 - name: '313' dtype: float32 - name: '314' dtype: float32 - name: '315' dtype: float32 - name: '316' dtype: float32 - name: '317' dtype: float32 - name: '318' dtype: float32 - name: '319' dtype: float32 - name: '320' dtype: float32 - name: '321' dtype: float32 - name: '322' dtype: float32 - name: '323' dtype: float32 - name: '324' dtype: float32 - name: '325' dtype: float32 - name: '326' dtype: float32 - name: '327' dtype: float32 - name: '328' dtype: float32 - name: '329' dtype: float32 - name: '330' dtype: float32 - name: '331' dtype: float32 - name: '332' dtype: float32 - name: '333' dtype: float32 - name: '334' dtype: float32 - name: '335' dtype: float32 - name: '336' dtype: float32 - name: '337' dtype: float32 - name: '338' dtype: float32 - name: '339' dtype: float32 - name: '340' dtype: float32 - name: '341' dtype: float32 - name: '342' dtype: float32 - 
name: '343' dtype: float32 - name: '344' dtype: float32 - name: '345' dtype: float32 - name: '346' dtype: float32 - name: '347' dtype: float32 - name: '348' dtype: float32 - name: '349' dtype: float32 - name: '350' dtype: float32 - name: '351' dtype: float32 - name: '352' dtype: float32 - name: '353' dtype: float32 - name: '354' dtype: float32 - name: '355' dtype: float32 - name: '356' dtype: float32 - name: '357' dtype: float32 - name: '358' dtype: float32 - name: '359' dtype: float32 - name: '360' dtype: float32 - name: '361' dtype: float32 - name: '362' dtype: float32 - name: '363' dtype: float32 - name: '364' dtype: float32 - name: '365' dtype: float32 - name: '366' dtype: float32 - name: '367' dtype: float32 - name: '368' dtype: float32 - name: '369' dtype: float32 - name: '370' dtype: float32 - name: '371' dtype: float32 - name: '372' dtype: float32 - name: '373' dtype: float32 - name: '374' dtype: float32 - name: '375' dtype: float32 - name: '376' dtype: float32 - name: '377' dtype: float32 - name: '378' dtype: float32 - name: '379' dtype: float32 - name: '380' dtype: float32 - name: '381' dtype: float32 - name: '382' dtype: float32 - name: '383' dtype: float32 - name: '384' dtype: float32 - name: '385' dtype: float32 - name: '386' dtype: float32 - name: '387' dtype: float32 - name: '388' dtype: float32 - name: '389' dtype: float32 - name: '390' dtype: float32 - name: '391' dtype: float32 - name: '392' dtype: float32 - name: '393' dtype: float32 - name: '394' dtype: float32 - name: '395' dtype: float32 - name: '396' dtype: float32 - name: '397' dtype: float32 - name: '398' dtype: float32 - name: '399' dtype: float32 - name: '400' dtype: float32 - name: '401' dtype: float32 - name: '402' dtype: float32 - name: '403' dtype: float32 - name: '404' dtype: float32 - name: '405' dtype: float32 - name: '406' dtype: float32 - name: '407' dtype: float32 - name: '408' dtype: float32 - name: '409' dtype: float32 - name: '410' dtype: float32 - name: '411' dtype: float32 
- name: '412' dtype: float32 - name: '413' dtype: float32 - name: '414' dtype: float32 - name: '415' dtype: float32 - name: '416' dtype: float32 - name: '417' dtype: float32 - name: '418' dtype: float32 - name: '419' dtype: float32 - name: '420' dtype: float32 - name: '421' dtype: float32 - name: '422' dtype: float32 - name: '423' dtype: float32 - name: '424' dtype: float32 - name: '425' dtype: float32 - name: '426' dtype: float32 - name: '427' dtype: float32 - name: '428' dtype: float32 - name: '429' dtype: float32 - name: '430' dtype: float32 - name: '431' dtype: float32 - name: '432' dtype: float32 - name: '433' dtype: float32 - name: '434' dtype: float32 - name: '435' dtype: float32 - name: '436' dtype: float32 - name: '437' dtype: float32 - name: '438' dtype: float32 - name: '439' dtype: float32 - name: '440' dtype: float32 - name: '441' dtype: float32 - name: '442' dtype: float32 - name: '443' dtype: float32 - name: '444' dtype: float32 - name: '445' dtype: float32 - name: '446' dtype: float32 - name: '447' dtype: float32 - name: '448' dtype: float32 - name: '449' dtype: float32 - name: '450' dtype: float32 - name: '451' dtype: float32 - name: '452' dtype: float32 - name: '453' dtype: float32 - name: '454' dtype: float32 - name: '455' dtype: float32 - name: '456' dtype: float32 - name: '457' dtype: float32 - name: '458' dtype: float32 - name: '459' dtype: float32 - name: '460' dtype: float32 - name: '461' dtype: float32 - name: '462' dtype: float32 - name: '463' dtype: float32 - name: '464' dtype: float32 - name: '465' dtype: float32 - name: '466' dtype: float32 - name: '467' dtype: float32 - name: '468' dtype: float32 - name: '469' dtype: float32 - name: '470' dtype: float32 - name: '471' dtype: float32 - name: '472' dtype: float32 - name: '473' dtype: float32 - name: '474' dtype: float32 - name: '475' dtype: float32 - name: '476' dtype: float32 - name: '477' dtype: float32 - name: '478' dtype: float32 - name: '479' dtype: float32 - name: '480' dtype: 
float32 - name: '481' dtype: float32 - name: '482' dtype: float32 - name: '483' dtype: float32 - name: '484' dtype: float32 - name: '485' dtype: float32 - name: '486' dtype: float32 - name: '487' dtype: float32 - name: '488' dtype: float32 - name: '489' dtype: float32 - name: '490' dtype: float32 - name: '491' dtype: float32 - name: '492' dtype: float32 - name: '493' dtype: float32 - name: '494' dtype: float32 - name: '495' dtype: float32 - name: '496' dtype: float32 - name: '497' dtype: float32 - name: '498' dtype: float32 - name: '499' dtype: float32 - name: '500' dtype: float32 - name: '501' dtype: float32 - name: '502' dtype: float32 - name: '503' dtype: float32 - name: '504' dtype: float32 - name: '505' dtype: float32 - name: '506' dtype: float32 - name: '507' dtype: float32 - name: '508' dtype: float32 - name: '509' dtype: float32 - name: '510' dtype: float32 - name: '511' dtype: float32 - name: '512' dtype: float32 - name: '513' dtype: float32 - name: '514' dtype: float32 - name: '515' dtype: float32 - name: '516' dtype: float32 - name: '517' dtype: float32 - name: '518' dtype: float32 - name: '519' dtype: float32 - name: '520' dtype: float32 - name: '521' dtype: float32 - name: '522' dtype: float32 - name: '523' dtype: float32 - name: '524' dtype: float32 - name: '525' dtype: float32 - name: '526' dtype: float32 - name: '527' dtype: float32 - name: '528' dtype: float32 - name: '529' dtype: float32 - name: '530' dtype: float32 - name: '531' dtype: float32 - name: '532' dtype: float32 - name: '533' dtype: float32 - name: '534' dtype: float32 - name: '535' dtype: float32 - name: '536' dtype: float32 - name: '537' dtype: float32 - name: '538' dtype: float32 - name: '539' dtype: float32 - name: '540' dtype: float32 - name: '541' dtype: float32 - name: '542' dtype: float32 - name: '543' dtype: float32 - name: '544' dtype: float32 - name: '545' dtype: float32 - name: '546' dtype: float32 - name: '547' dtype: float32 - name: '548' dtype: float32 - name: '549' 
dtype: float32 - name: '550' dtype: float32 - name: '551' dtype: float32 - name: '552' dtype: float32 - name: '553' dtype: float32 - name: '554' dtype: float32 - name: '555' dtype: float32 - name: '556' dtype: float32 - name: '557' dtype: float32 - name: '558' dtype: float32 - name: '559' dtype: float32 - name: '560' dtype: float32 - name: '561' dtype: float32 - name: '562' dtype: float32 - name: '563' dtype: float32 - name: '564' dtype: float32 - name: '565' dtype: float32 - name: '566' dtype: float32 - name: '567' dtype: float32 - name: '568' dtype: float32 - name: '569' dtype: float32 - name: '570' dtype: float32 - name: '571' dtype: float32 - name: '572' dtype: float32 - name: '573' dtype: float32 - name: '574' dtype: float32 - name: '575' dtype: float32 - name: '576' dtype: float32 - name: '577' dtype: float32 - name: '578' dtype: float32 - name: '579' dtype: float32 - name: '580' dtype: float32 - name: '581' dtype: float32 - name: '582' dtype: float32 - name: '583' dtype: float32 - name: '584' dtype: float32 - name: '585' dtype: float32 - name: '586' dtype: float32 - name: '587' dtype: float32 - name: '588' dtype: float32 - name: '589' dtype: float32 - name: '590' dtype: float32 - name: '591' dtype: float32 - name: '592' dtype: float32 - name: '593' dtype: float32 - name: '594' dtype: float32 - name: '595' dtype: float32 - name: '596' dtype: float32 - name: '597' dtype: float32 - name: '598' dtype: float32 - name: '599' dtype: float32 - name: '600' dtype: float32 - name: '601' dtype: float32 - name: '602' dtype: float32 - name: '603' dtype: float32 - name: '604' dtype: float32 - name: '605' dtype: float32 - name: '606' dtype: float32 - name: '607' dtype: float32 - name: '608' dtype: float32 - name: '609' dtype: float32 - name: '610' dtype: float32 - name: '611' dtype: float32 - name: '612' dtype: float32 - name: '613' dtype: float32 - name: '614' dtype: float32 - name: '615' dtype: float32 - name: '616' dtype: float32 - name: '617' dtype: float32 - name: 
'618' dtype: float32 - name: '619' dtype: float32 - name: '620' dtype: float32 - name: '621' dtype: float32 - name: '622' dtype: float32 - name: '623' dtype: float32 - name: '624' dtype: float32 - name: '625' dtype: float32 - name: '626' dtype: float32 - name: '627' dtype: float32 - name: '628' dtype: float32 - name: '629' dtype: float32 - name: '630' dtype: float32 - name: '631' dtype: float32 - name: '632' dtype: float32 - name: '633' dtype: float32 - name: '634' dtype: float32 - name: '635' dtype: float32 - name: '636' dtype: float32 - name: '637' dtype: float32 - name: '638' dtype: float32 - name: '639' dtype: float32 - name: '640' dtype: float32 - name: '641' dtype: float32 - name: '642' dtype: float32 - name: '643' dtype: float32 - name: '644' dtype: float32 - name: '645' dtype: float32 - name: '646' dtype: float32 - name: '647' dtype: float32 - name: '648' dtype: float32 - name: '649' dtype: float32 - name: '650' dtype: float32 - name: '651' dtype: float32 - name: '652' dtype: float32 - name: '653' dtype: float32 - name: '654' dtype: float32 - name: '655' dtype: float32 - name: '656' dtype: float32 - name: '657' dtype: float32 - name: '658' dtype: float32 - name: '659' dtype: float32 - name: '660' dtype: float32 - name: '661' dtype: float32 - name: '662' dtype: float32 - name: '663' dtype: float32 - name: '664' dtype: float32 - name: '665' dtype: float32 - name: '666' dtype: float32 - name: '667' dtype: float32 - name: '668' dtype: float32 - name: '669' dtype: float32 - name: '670' dtype: float32 - name: '671' dtype: float32 - name: '672' dtype: float32 - name: '673' dtype: float32 - name: '674' dtype: float32 - name: '675' dtype: float32 - name: '676' dtype: float32 - name: '677' dtype: float32 - name: '678' dtype: float32 - name: '679' dtype: float32 - name: '680' dtype: float32 - name: '681' dtype: float32 - name: '682' dtype: float32 - name: '683' dtype: float32 - name: '684' dtype: float32 - name: '685' dtype: float32 - name: '686' dtype: float32 - 
name: '687' dtype: float32 - name: '688' dtype: float32 - name: '689' dtype: float32 - name: '690' dtype: float32 - name: '691' dtype: float32 - name: '692' dtype: float32 - name: '693' dtype: float32 - name: '694' dtype: float32 - name: '695' dtype: float32 - name: '696' dtype: float32 - name: '697' dtype: float32 - name: '698' dtype: float32 - name: '699' dtype: float32 - name: '700' dtype: float32 - name: '701' dtype: float32 - name: '702' dtype: float32 - name: '703' dtype: float32 - name: '704' dtype: float32 - name: '705' dtype: float32 - name: '706' dtype: float32 - name: '707' dtype: float32 - name: '708' dtype: float32 - name: '709' dtype: float32 - name: '710' dtype: float32 - name: '711' dtype: float32 - name: '712' dtype: float32 - name: '713' dtype: float32 - name: '714' dtype: float32 - name: '715' dtype: float32 - name: '716' dtype: float32 - name: '717' dtype: float32 - name: '718' dtype: float32 - name: '719' dtype: float32 - name: '720' dtype: float32 - name: '721' dtype: float32 - name: '722' dtype: float32 - name: '723' dtype: float32 - name: '724' dtype: float32 - name: '725' dtype: float32 - name: '726' dtype: float32 - name: '727' dtype: float32 - name: '728' dtype: float32 - name: '729' dtype: float32 - name: '730' dtype: float32 - name: '731' dtype: float32 - name: '732' dtype: float32 - name: '733' dtype: float32 - name: '734' dtype: float32 - name: '735' dtype: float32 - name: '736' dtype: float32 - name: '737' dtype: float32 - name: '738' dtype: float32 - name: '739' dtype: float32 - name: '740' dtype: float32 - name: '741' dtype: float32 - name: '742' dtype: float32 - name: '743' dtype: float32 - name: '744' dtype: float32 - name: '745' dtype: float32 - name: '746' dtype: float32 - name: '747' dtype: float32 - name: '748' dtype: float32 - name: '749' dtype: float32 - name: '750' dtype: float32 - name: '751' dtype: float32 - name: '752' dtype: float32 - name: '753' dtype: float32 - name: '754' dtype: float32 - name: '755' dtype: float32 
- name: '756' dtype: float32 - name: '757' dtype: float32 - name: '758' dtype: float32 - name: '759' dtype: float32 - name: '760' dtype: float32 - name: '761' dtype: float32 - name: '762' dtype: float32 - name: '763' dtype: float32 - name: '764' dtype: float32 - name: '765' dtype: float32 - name: '766' dtype: float32 - name: '767' dtype: float32 - name: label dtype: string splits: - name: train num_bytes: 115576729.6875 num_examples: 37500 - name: test num_bytes: 38525577.5 num_examples: 12500 download_size: 0 dataset_size: 154102307.1875 --- # Dataset Card for "Thunderbird_GPT2_Baseline" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Amit/ddpm-butterflies-128
--- license: unknown ---
dim/yandex_q_200k
--- dataset_info: features: - name: description dtype: string - name: question dtype: string - name: answer dtype: string splits: - name: train num_bytes: 291927288.0830295 num_examples: 200000 download_size: 155069887 dataset_size: 291927288.0830295 --- # Dataset Card for "yandex_q_200k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ggul-tiger/negobot_price_weak_datas
--- dataset_info: features: - name: price dtype: int64 - name: events list: - name: message dtype: string - name: role dtype: string splits: - name: train num_bytes: 59481 num_examples: 300 download_size: 15988 dataset_size: 59481 --- # Dataset Card for "negobot_price_weak_datas" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Kamyar-zeinalipour/turkish_train_v2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 57527348 num_examples: 187395 download_size: 9804859 dataset_size: 57527348 --- # Dataset Card for "turkish_train_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Romildon/boa
--- license: openrail ---
menyo20k_mt
--- annotations_creators: - expert-generated - found language_creators: - found language: - en - yo license: - cc-by-nc-4.0 multilinguality: - translation size_categories: - 10K<n<100K source_datasets: - original task_categories: - translation task_ids: [] paperswithcode_id: menyo-20k pretty_name: MENYO-20k dataset_info: features: - name: translation dtype: translation: languages: - en - yo config_name: menyo20k_mt splits: - name: train num_bytes: 2551345 num_examples: 10070 - name: validation num_bytes: 870011 num_examples: 3397 - name: test num_bytes: 1905432 num_examples: 6633 download_size: 5206234 dataset_size: 5326788 --- # Dataset Card for MENYO-20k ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** - **Repository:** https://github.com/uds-lsv/menyo-20k_MT/ - **Paper:** [The Effect of Domain and Diacritics in Yorùbá-English Neural Machine Translation](https://arxiv.org/abs/2103.08647) - **Leaderboard:** - **Point of Contact:** ### Dataset Summary MENYO-20k is a 
multi-domain parallel dataset with texts obtained from news articles, ted talks, movie transcripts, radio transcripts, science and technology texts, and other short articles curated from the web and professional translators. The dataset has 20,100 parallel sentences split into 10,070 training sentences, 3,397 development sentences, and 6,633 test sentences (3,419 multi-domain, 1,714 news domain, and 1,500 ted talks speech transcript domain). ### Supported Tasks and Leaderboards [More Information Needed] ### Languages Languages are English and Yoruba. ## Dataset Structure ### Data Instances An instance example: ``` {'translation': {'en': 'Unit 1: What is Creative Commons?', 'yo': 'Ìdá 1: Kín ni Creative Commons?' } } ``` ### Data Fields - `translation`: - `en`: English sentence. - `yo`: Yoruba sentence. ### Data Splits Training, validation and test splits are available. ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information The dataset is open but for non-commercial use because some data sources like Ted talks and JW news require permission for commercial use. 
The dataset is licensed under Creative Commons [Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/) License: https://github.com/uds-lsv/menyo-20k_MT/blob/master/LICENSE ### Citation Information If you use this dataset, please cite this paper: ``` @inproceedings{adelani-etal-2021-effect, title = "The Effect of Domain and Diacritics in {Y}oruba{--}{E}nglish Neural Machine Translation", author = "Adelani, David and Ruiter, Dana and Alabi, Jesujoba and Adebonojo, Damilola and Ayeni, Adesina and Adeyemi, Mofe and Awokoya, Ayodele Esther and Espa{\~n}a-Bonet, Cristina", booktitle = "Proceedings of the 18th Biennial Machine Translation Summit (Volume 1: Research Track)", month = aug, year = "2021", address = "Virtual", publisher = "Association for Machine Translation in the Americas", url = "https://aclanthology.org/2021.mtsummit-research.6", pages = "61--75", abstract = "Massively multilingual machine translation (MT) has shown impressive capabilities and including zero and few-shot translation between low-resource language pairs. However and these models are often evaluated on high-resource languages with the assumption that they generalize to low-resource ones. The difficulty of evaluating MT models on low-resource pairs is often due to lack of standardized evaluation datasets. In this paper and we present MENYO-20k and the first multi-domain parallel corpus with a especially curated orthography for Yoruba{--}English with standardized train-test splits for benchmarking. We provide several neural MT benchmarks and compare them to the performance of popular pre-trained (massively multilingual) MT models both for the heterogeneous test set and its subdomains. Since these pre-trained models use huge amounts of data with uncertain quality and we also analyze the effect of diacritics and a major characteristic of Yoruba and in the training data. 
We investigate how and when this training condition affects the final quality of a translation and its understandability. Our models outperform massively multilingual models such as Google ($+8.7$ BLEU) and Facebook M2M ($+9.1$) when translating to Yoruba and setting a high quality benchmark for future research.", } ``` ### Contributions Thanks to [@yvonnegitau](https://github.com/yvonnegitau) for adding this dataset.
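A minimal sketch (not part of the original card) of reading one instance in the structure shown under Data Instances; `sample` copies the example record from the card, and the helper name `to_pair` is illustrative only:

```python
# A minimal sketch of reading one MENYO-20k instance in the
# structure shown under "Data Instances". `sample` copies the
# example record; `to_pair` is an illustrative helper name,
# not part of any MENYO-20k tooling.
sample = {
    "translation": {
        "en": "Unit 1: What is Creative Commons?",
        "yo": "Ìdá 1: Kín ni Creative Commons?",
    }
}

def to_pair(record):
    """Return the (English, Yoruba) sentence pair of one instance."""
    t = record["translation"]
    return t["en"], t["yo"]

en, yo = to_pair(sample)
print(en)  # English side of the pair
print(yo)  # Yoruba side of the pair
```

In practice the records would come from the Hugging Face `datasets` loader rather than a hard-coded dict.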
BigAction/mind2web_clean
--- dataset_info: features: - name: 'Unnamed: 0' dtype: int64 - name: split dtype: string - name: annotation_uid dtype: string - name: confirmed_task dtype: string - name: raw_html dtype: string - name: cleaned_html dtype: string - name: action_uid dtype: string - name: operation dtype: string - name: code dtype: string - name: cur_actions_desc dtype: string - name: cur_actions_reprs dtype: string - name: pos_candidates dtype: string - name: prev_actions_desc dtype: string - name: prev_actions_reprs dtype: string splits: - name: train num_bytes: 105415456 num_examples: 199 download_size: 17218698 dataset_size: 105415456 configs: - config_name: default data_files: - split: train path: data/train-* ---
Nandini82/sciq-qa
--- dataset_info: features: - name: question dtype: string - name: choices dtype: string - name: lecture dtype: string splits: - name: train num_bytes: 4884610.337186403 num_examples: 9448 - name: validation num_bytes: 407573.096 num_examples: 799 - name: test num_bytes: 416782.37 num_examples: 799 download_size: 3766000 dataset_size: 5708965.803186403 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
ranimeree/mixed_data
--- dataset_info: features: - name: pixel_values dtype: image - name: label dtype: image splits: - name: train num_bytes: 1898147814.73 num_examples: 11078 - name: validation num_bytes: 61731350.0 num_examples: 352 - name: test num_bytes: 61079974.0 num_examples: 348 download_size: 1432353431 dataset_size: 2020959138.73 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
AdapterOcean/Open_Platypus_standardized_cluster_6_std
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 - name: cluster dtype: float64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 2339198 num_examples: 6042 download_size: 1112562 dataset_size: 2339198 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Open_Platypus_standardized_cluster_6_std" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Yah216/APCD-Poem_Rawiy_detection
--- language: - ar task_categories: - text-classification --- # AutoTrain Dataset for project: Poem_Rawiy_detection ## Dataset Description We used the APCD dataset cited hereafter for pretraining the model. The dataset has been cleaned and only the main text and the Qafiyah columns were kept: ``` @Article{Yousef2019LearningMetersArabicEnglish-arxiv, author = {Yousef, Waleed A. and Ibrahime, Omar M. and Madbouly, Taha M. and Mahmoud, Moustafa A.}, title = {Learning Meters of Arabic and English Poems With Recurrent Neural Networks: a Step Forward for Language Understanding and Synthesis}, journal = {arXiv preprint arXiv:1905.05700}, year = 2019, url = {https://github.com/hci-lab/LearningMetersPoems} } ``` ### Languages The BCP-47 code for the dataset's language is ar. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "text": "\u0643\u0644\u0651\u064c \u064a\u064e\u0632\u0648\u0644\u064f \u0633\u064e\u0631\u064a\u0639\u0627\u064b \u0644\u0627 \u062b\u064e\u0628\u0627\u062a\u064e \u0644\u0647\u064f \u0641\u0643\u064f\u0646 \u0644\u0650\u0648\u064e\u0642\u062a\u0643\u064e \u064a\u0627 \u0645\u0650\u0633\u0643\u064a\u0646\u064f \u0645\u064f\u063a\u062a\u064e\u0646\u0650\u0645\u0627", "target": 27 }, { "text": "\u0648\u0642\u062f \u0623\u0628\u0631\u0632\u064e \u0627\u0644\u0631\u0651\u064f\u0645\u0651\u064e\u0627\u0646\u064f \u0644\u0644\u0637\u0631\u0641\u0650 \u063a\u064f\u0635\u0652\u0646\u064e\u0647\u064f \u0646\u0647\u0648\u062f\u0627\u064b \u062a\u064f\u0635\u0627\u0646\u064f \u0627\u0644\u0644\u0645\u0633\u064e \u0639\u0646 \u0643\u0641\u0651\u0650 \u0623\u062d\u0645\u0642\u0650", "target": 23 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "text": "Value(dtype='string', id=None)", "target": "ClassLabel(num_classes=35, names=['\u0621', '\u0624', '\u0627', '\u0628', '\u062a', '\u062b', '\u062c', '\u062d', '\u062e', '\u062f', '\u0630', '\u0631', '\u0632', '\u0633',
'\u0634', '\u0635', '\u0636', '\u0637', '\u0637\u0646', '\u0638', '\u0639', '\u063a', '\u0641', '\u0642', '\u0643', '\u0644', '\u0644\u0627', '\u0645', '\u0646', '\u0647', '\u0647\u0640', '\u0647\u0646', '\u0648', '\u0649', '\u064a'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: | Split name | Num samples | | ------------ | ------------------- | | train | 1347718 | | valid | 336950 |
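As a hedged illustration of the ClassLabel mapping above (not part of the original card), the snippet below decodes the integer `target` of the two sample instances back to their rhyme letters; `names` is the unescaped form of the 35 class names printed in the Dataset Fields section, and `rawiy` is an assumed helper name, not part of the dataset tooling:

```python
# Hedged illustration: decoding the integer `target` back to its
# Qafiyah (rhyme letter). `names` is the unescaped form of the 35
# ClassLabel names listed above; `rawiy` is an assumed helper name.
names = [
    "ء", "ؤ", "ا", "ب", "ت", "ث", "ج", "ح", "خ", "د", "ذ", "ر",
    "ز", "س", "ش", "ص", "ض", "ط", "طن", "ظ", "ع", "غ", "ف", "ق",
    "ك", "ل", "لا", "م", "ن", "ه", "هـ", "هن", "و", "ى", "ي",
]
assert len(names) == 35  # num_classes from the Dataset Fields section

def rawiy(target: int) -> str:
    """Map a class id from the `target` column to its rhyme letter."""
    return names[target]

print(rawiy(27))  # target of the first sample instance above
print(rawiy(23))  # target of the second sample instance above
```

For the two samples shown, targets 27 and 23 map to م and ق respectively.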
nahidhasannascenia/Bangla-Instruct-Llama_alpaca-llama2-chat
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 2632731 num_examples: 1185 download_size: 520404 dataset_size: 2632731 configs: - config_name: default data_files: - split: train path: data/train-* ---
BadreddineHug/3s_librispeech_subset
--- dataset_info: features: - name: file dtype: string - name: audio struct: - name: array sequence: float64 - name: path dtype: string - name: sampling_rate dtype: int64 - name: text dtype: string - name: speaker_id dtype: int64 - name: chapter_id dtype: int64 - name: id dtype: string splits: - name: train num_bytes: 1832833 num_examples: 6 download_size: 518341 dataset_size: 1832833 configs: - config_name: default data_files: - split: train path: data/train-* ---
Sefaria/links
--- license: gpl-3.0 ---
ttaront/filtered_clx
--- configs: - config_name: en data_files: "en/*.parquet" - config_name: ja data_files: "ja/*.parquet" pretty_name: filtered_clx language: - en - ja ---
Tngarg/hindi_train
--- dataset_info: features: - name: 'Unnamed: 0' dtype: int64 - name: text dtype: string - name: sentiment dtype: string - name: label dtype: int64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 318038 num_examples: 2212 download_size: 202043 dataset_size: 318038 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "hindi_train" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
aoome123/haha
--- dataset_info: config_name: aoome123/use features: - name: input_features sequence: sequence: float32 - name: labels sequence: int64 splits: - name: train num_bytes: 5462003720 num_examples: 5687 - name: test num_bytes: 682871112 num_examples: 711 - name: valid num_bytes: 682869512 num_examples: 711 download_size: 902052782 dataset_size: 6827744344 configs: - config_name: aoome123/use data_files: - split: train path: aoome123/use/train-* - split: test path: aoome123/use/test-* - split: valid path: aoome123/use/valid-* --- # Dataset Card for "important" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
test-org-usm3d/test-gated
--- extra_gated_prompt: "You agree to not use the dataset to conduct experiments that cause harm to human subjects." extra_gated_fields: Company: text Country: country Specific date: date_picker I want to use this dataset for: type: select options: - Research - Education - label: Other value: other I agree to use this dataset for non-commercial use ONLY: checkbox extra_gated_heading: "Acknowledge license to accept the repository" extra_gated_description: "Our team may take 2-3 days to process your request" extra_gated_button_content: "Acknowledge license" ---
PeterLawrence/processed_demo
--- dataset_info: features: - name: messages list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 26986 num_examples: 34 download_size: 5785 dataset_size: 26986 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "processed_demo" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vietgpt-archive/legal_document_vi
--- dataset_info: features: - name: subject dtype: string - name: meta struct: - name: effective_date dtype: string - name: issuing_agency dtype: string - name: promulgation_date dtype: string - name: sign_number dtype: string - name: signer dtype: string - name: type dtype: string - name: url dtype: string - name: text dtype: string splits: - name: train num_bytes: 7634485791 num_examples: 424187 download_size: 2642253186 dataset_size: 7634485791 --- # Dataset Card for "legal_document_vi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
alvarobartt/stack-exchange-paired-mini
--- dataset_info: features: - name: qid dtype: int64 - name: question dtype: string - name: date dtype: string - name: metadata sequence: string - name: response_j dtype: string - name: response_k dtype: string splits: - name: train num_bytes: 335534 num_examples: 100 download_size: 105377 dataset_size: 335534 configs: - config_name: default data_files: - split: train path: data/train-* task_categories: - text-generation - question-answering language: - en size_categories: - n<1K --- # StackExchange Paired Mini (100 samples) This is a subset of the `StackExchange Paired` [lvwerra/stack-exchange-paired](https://hf.co/lvwerra/stack-exchange-paired) dataset. ## Disclaimer For licensing or any other related detail, please refer to the original dataset linked above.
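In the parent dataset, `response_j` holds the preferred answer and `response_k` the rejected one for the same question. A minimal stdlib sketch of reshaping rows with this schema into the prompt/chosen/rejected triples commonly used for preference tuning; the sample row is hypothetical:

```python
# Hypothetical row following the schema above.
rows = [
    {
        "qid": 1,
        "question": "How do I reverse a list in Python?",
        "response_j": "Use list.reverse() or reversed().",
        "response_k": "You cannot.",
    }
]

def to_preference_triples(rows):
    """Reshape paired responses into the prompt/chosen/rejected layout."""
    return [
        {"prompt": r["question"], "chosen": r["response_j"], "rejected": r["response_k"]}
        for r in rows
    ]

triples = to_preference_triples(rows)
print(triples[0]["chosen"])  # Use list.reverse() or reversed().
```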
Coco3384/name_of_your_dataset
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': test '1': train splits: - name: train num_bytes: 1260861060.079 num_examples: 10479 - name: validation num_bytes: 5199.0 num_examples: 1 - name: test num_bytes: 262109400.0 num_examples: 2800 download_size: 1195043893 dataset_size: 1522975659.079 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
TheFinAI/flare-causal20-sc
--- dataset_info: features: - name: id dtype: string - name: query dtype: string - name: answer dtype: string - name: text dtype: string - name: choices sequence: string - name: gold dtype: int64 splits: - name: test num_bytes: 6914334 num_examples: 8628 download_size: 2501268 dataset_size: 6914334 --- # Dataset Card for "flare-causal20-sc" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mbeaty2/data
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 2929210.0 num_examples: 127 download_size: 1529434 dataset_size: 2929210.0 --- # Dataset Card for "data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
result-kand2-sdxl-wuerst-karlo/0415e725
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 163 num_examples: 10 download_size: 1335 dataset_size: 163 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "0415e725" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Akshita15/cubelelo_data
--- dataset_info: features: - name: text dtype: string - name: inputs struct: - name: text dtype: string - name: prediction dtype: 'null' - name: prediction_agent dtype: 'null' - name: annotation dtype: string - name: annotation_agent dtype: string - name: vectors struct: - name: paraphrase-multilingual-mpnet-base-v2 sequence: float64 - name: multi_label dtype: bool - name: explanation dtype: 'null' - name: id dtype: string - name: metadata dtype: 'null' - name: status dtype: string - name: event_timestamp dtype: timestamp[us] - name: metrics struct: - name: text_length dtype: int64 splits: - name: train num_bytes: 12103055 num_examples: 1922 download_size: 9829799 dataset_size: 12103055 --- # Dataset Card for "cubelelo_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
NYUMets/nyumets_brats
--- license: other extra_gated_heading: "NYU Langone Health NYUMets Dataset Sharing Agreement" extra_gated_prompt: "By registering for downloads from the NYUMets Dataset, I agree to this Dataset Sharing Agreement, as well as to the terms of use as posted and updated periodically at: http://nyulangone.org/policies-disclaimers/disclaimer.\nThe NYUMets Dataset is considered proprietary to and owned by New York University and NYU Langone Health (together “NYU”). Other than the rights granted herein, NYU retains all rights, title, and interest in the NYUMets Dataset.\nSubject to the provisions of this Agreement, NYU shall give to me access to and the right to download the NYUMets Dataset, and NYU hereby grants to me a non-exclusive, royalty-free license to use the NYUMets Dataset for internal research or educational purposes only and only as permitted by this Agreement. This Agreement conveys no other rights of any sort with respect to the NYUMets Dataset or the intellectual property rights embodied therein.\nI will receive AWS permissions to access the NYUMets Dataset without charge for internal research or educational purposes only. The link will permit me to download and access a verbatim copy of the NYUMets Dataset solely for such use. I will NOT SHARE THE DOWNLOADED DATA or AWS Account Access Credentials to the NYUMets Dataset with others. If another user within my organization or elsewhere wishes to obtain a copy of and use the NYUMets Dataset, they must register as an individual user and comply with all the terms of this Agreement." extra_gated_fields: Name: text Email: text Organization: text Phone Number: text By checking this box, you are certifying that you have read and understood the NYU Langone Health NYUMets Dataset Sharing Agreement: checkbox ---
boapps/kmdb_entities
--- dataset_info: features: - name: id dtype: int64 - name: people sequence: string - name: institutions sequence: string - name: places sequence: string - name: text dtype: string - name: ent_lemmas sequence: sequence: string - name: ent_tokens list: - name: text dtype: string - name: lemma dtype: string - name: tokens list: - name: i dtype: int64 - name: text dtype: string - name: lemma dtype: string - name: ent_type dtype: string - name: iob dtype: string - name: words sequence: string splits: - name: train num_bytes: 670240594 num_examples: 46914 download_size: 262010648 dataset_size: 670240594 configs: - config_name: default data_files: - split: train path: data/train-* ---
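The per-token `iob` field in the `tokens` feature can be folded back into entity spans. A stdlib sketch, assuming the conventional `B`/`I`/`O` tag values (the sample tokens are hypothetical, not taken from the corpus):

```python
def iob_to_spans(tokens):
    """Group tokens carrying B/I/O tags (as in the `tokens` feature above) into entity spans."""
    spans, current = [], None
    for tok in tokens:
        if tok["iob"] == "B":                # a new entity starts here
            if current:
                spans.append(current)
            current = {"ent_type": tok["ent_type"], "text": [tok["text"]]}
        elif tok["iob"] == "I" and current:  # continuation of the open entity
            current["text"].append(tok["text"])
        else:                                # "O" (or a stray "I") closes any open span
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [{"ent_type": s["ent_type"], "text": " ".join(s["text"])} for s in spans]

# Hypothetical tokens illustrating the schema.
toks = [
    {"text": "Budapest", "ent_type": "LOC", "iob": "B"},
    {"text": "is", "ent_type": "", "iob": "O"},
    {"text": "Kovacs", "ent_type": "PER", "iob": "B"},
    {"text": "Anna", "ent_type": "PER", "iob": "I"},
]
print(iob_to_spans(toks))  # [{'ent_type': 'LOC', 'text': 'Budapest'}, {'ent_type': 'PER', 'text': 'Kovacs Anna'}]
```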
one-sec-cv12/chunk_149
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 17601084144.375 num_examples: 183253 download_size: 15549262193 dataset_size: 17601084144.375 --- # Dataset Card for "chunk_149" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DataStudio/OCR_document_bluir_part_07
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 3085454343.625 num_examples: 175835 download_size: 3081898367 dataset_size: 3085454343.625 --- # Dataset Card for "OCR_document_bluir_part_07" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
graphs-datasets/MD17-salicylic_acid
--- license: unknown task_categories: - graph-ml --- # Dataset Card for salicylic_acid ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [External Use](#external-use) - [PyGeometric](#pygeometric) - [Dataset Structure](#dataset-structure) - [Data Properties](#data-properties) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Additional Information](#additional-information) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **[Homepage](http://www.sgdml.org/#datasets)** - **Paper:** (see citation) ### Dataset Summary The `salicylic_acid` dataset is a molecular dynamics (MD) dataset. The total energy and force labels for each dataset were computed using the PBE+vdW-TS electronic structure method. All geometries are in Angstrom, energies and forces are given in kcal/mol and kcal/mol/A respectively. ### Supported Tasks and Leaderboards `salicylic_acid` should be used for organic molecular property prediction, a regression task on 1 property. The score used is the mean absolute error (in meV) for energy prediction. 
## External Use ### PyGeometric To load this dataset in PyTorch Geometric, do the following: ```python from datasets import load_dataset from torch_geometric.data import Data from torch_geometric.loader import DataLoader dataset_hf = load_dataset("graphs-datasets/MD17-salicylic_acid") # For the train set (replace by valid or test as needed) dataset_pg_list = [Data(**graph) for graph in dataset_hf["train"]] dataset_pg = DataLoader(dataset_pg_list) ``` ## Dataset Structure ### Data Properties | property | value | |---|---| | scale | big | | #graphs | 220231 | | average #nodes | 16.0 | | average #edges | 208.2681717461586 | ### Data Fields Each row of a given file is a graph, with: - `node_feat` (list: #nodes x #node-features): nodes - `edge_index` (list: 2 x #edges): pairs of nodes constituting edges - `edge_attr` (list: #edges x #edge-features): for the aforementioned edges, contains their features - `y` (list: #labels): contains the labels available to predict - `num_nodes` (int): number of nodes of the graph ### Data Splits This data is not split, and should be used with cross validation. It comes from the PyGeometric version of the dataset. ## Additional Information ### Licensing Information The dataset has been released under license unknown. ### Citation Information ``` @inproceedings{Morris+2020, title={TUDataset: A collection of benchmark datasets for learning with graphs}, author={Christopher Morris and Nils M. Kriege and Franka Bause and Kristian Kersting and Petra Mutzel and Marion Neumann}, booktitle={ICML 2020 Workshop on Graph Representation Learning and Beyond (GRL+ 2020)}, archivePrefix={arXiv}, eprint={2007.08663}, url={www.graphlearning.io}, year={2020} } ``` ``` @article{Chmiela_2017, doi = {10.1126/sciadv.1603015}, url = {https://doi.org/10.1126%2Fsciadv.1603015}, year = 2017, month = {may}, publisher = {American Association for the Advancement of Science ({AAAS})}, volume = {3}, number = {5}, author = {Stefan Chmiela and Alexandre Tkatchenko and Huziel E. 
Sauceda and Igor Poltavsky and Kristof T. Schütt and Klaus-Robert Müller}, title = {Machine learning of accurate energy-conserving molecular force fields}, journal = {Science Advances} } ```
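The card gives energies in kcal/mol, while the benchmark metric is reported in meV, so predictions need a unit conversion before scoring. A small helper, using the standard relation 1 eV ≈ 23.0605 kcal/mol:

```python
KCAL_PER_MOL_PER_EV = 23.0605  # 1 eV expressed in kcal/mol (standard conversion factor)

def kcal_per_mol_to_mev(energy: float) -> float:
    """Convert an energy value from kcal/mol (the dataset's unit) to meV (the metric's unit)."""
    return energy / KCAL_PER_MOL_PER_EV * 1000.0

print(round(kcal_per_mol_to_mev(1.0), 2))  # 43.36
```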
ademax/ocr_scan_vi_01
--- language: vi dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 410862389.5689411 num_examples: 11003 - name: test num_bytes: 37340942.4310589 num_examples: 1000 download_size: 447854730 dataset_size: 448203332.0 --- # Dataset Card for "quan_ocr_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/VALUE_qqp_dey_it
--- dataset_info: features: - name: question1 dtype: string - name: question2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 331535 num_examples: 1870 - name: test num_bytes: 3266804 num_examples: 18707 - name: train num_bytes: 2881908 num_examples: 16269 download_size: 4036549 dataset_size: 6480247 --- # Dataset Card for "VALUE_qqp_dey_it" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-eval-futin__feed-top_vi-71f14a-2175469967
--- type: predictions tags: - autotrain - evaluation datasets: - futin/feed eval_info: task: text_zero_shot_classification model: facebook/opt-1.3b metrics: [] dataset_name: futin/feed dataset_config: top_vi dataset_split: test col_mapping: text: text classes: classes target: target --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Zero-Shot Text Classification * Model: facebook/opt-1.3b * Dataset: futin/feed * Config: top_vi * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@futin](https://huggingface.co/futin) for evaluating this model.
SIA86/LFQAKnowledgeBase
--- license: openrail task_categories: - question-answering - text2text-generation language: - ru pretty_name: LFQA_KB size_categories: - n<1K dataset_info: features: - name: id dtype: int64 - name: title dtype: string - name: heading1 dtype: string - name: heading2 dtype: string - name: heading3 dtype: string - name: heading4 dtype: string - name: heading5 dtype: string - name: text dtype: string ---
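Since each row carries up to five nested headings alongside its `text`, a natural preprocessing step for long-form QA retrieval is to join the non-empty headings into a breadcrumb and prepend it to the passage. A minimal stdlib sketch; the sample row is hypothetical, and empty headings are assumed to be stored as `None` or empty strings:

```python
def make_context(row: dict) -> str:
    """Join the non-empty heading1..heading5 fields into a breadcrumb and prepend it to the text."""
    headings = [row.get(f"heading{i}") for i in range(1, 6)]
    breadcrumb = " > ".join(h for h in headings if h)
    return f"{breadcrumb}\n{row['text']}" if breadcrumb else row["text"]

# Hypothetical row following the schema above.
row = {"id": 1, "title": "Manual", "heading1": "Setup", "heading2": "Install",
       "heading3": None, "heading4": None, "heading5": None, "text": "Run the installer."}
print(make_context(row))  # Setup > Install\nRun the installer.
```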
mrjang/mini-platypus
--- dataset_info: features: - name: instruction dtype: string - name: output dtype: string splits: - name: train num_bytes: 4186564 num_examples: 1000 download_size: 2245921 dataset_size: 4186564 configs: - config_name: default data_files: - split: train path: data/train-* ---
open-llm-leaderboard/details_rufjdk5480__gov-qna-ko-merged
--- pretty_name: Evaluation run of rufjdk5480/gov-qna-ko-merged dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [rufjdk5480/gov-qna-ko-merged](https://huggingface.co/rufjdk5480/gov-qna-ko-merged)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rufjdk5480__gov-qna-ko-merged\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-05T04:31:12.088602](https://huggingface.co/datasets/open-llm-leaderboard/details_rufjdk5480__gov-qna-ko-merged/blob/main/results_2024-01-05T04-31-12.088602.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6966669201907558,\n\ \ \"acc_stderr\": 0.03014874135400622,\n \"acc_norm\": 0.7075970828086718,\n\ \ \"acc_norm_stderr\": 0.03073409651050876,\n \"mc1\": 0.2178702570379437,\n\ \ \"mc1_stderr\": 0.014450846714123892,\n \"mc2\": 0.48607139277849154,\n\ \ \"mc2_stderr\": 0.01710096370379909\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.3506825938566553,\n \"acc_stderr\": 0.013944635930726089,\n\ \ \"acc_norm\": 0.39505119453924914,\n \"acc_norm_stderr\": 0.014285898292938165\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33917546305516827,\n\ \ \"acc_stderr\": 0.004724619193427588,\n \"acc_norm\": 0.39055964947221666,\n\ \ \"acc_norm_stderr\": 0.004868787333436588\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\ \ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n\ \ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.03016753346863271,\n\ \ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.03016753346863271\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\ \ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \ \ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.0254478638251086,\n\ \ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.0254478638251086\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8680555555555556,\n\ \ \"acc_stderr\": 0.02830096838204443,\n \"acc_norm\": 0.8680555555555556,\n\ \ \"acc_norm_stderr\": 0.02830096838204443\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \ \ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\ \ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\ \ \"acc_stderr\": 0.034961014811911786,\n \"acc_norm\": 0.6994219653179191,\n\ \ \"acc_norm_stderr\": 0.034961014811911786\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n\ \ \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n\ \ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \ \ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745636,\n\ \ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745636\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n\ \ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n\ \ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.038552896163789485,\n\ \ \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.038552896163789485\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\ acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 
0.025728230952130726\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\ \ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\ \ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n\ \ \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n\ \ \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n\ \ \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\ : 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\ \ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"\ acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n\ \ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857726,\n\ \ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857726\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \ \ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n\ \ \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\ acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\ acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"\ acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\ acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878453,\n \ \ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878453\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\ \ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\ \ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\ \ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8512396694214877,\n 
\"acc_stderr\": 0.03248470083807194,\n \"\ acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\ \ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\ \ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\ \ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761012,\n\ \ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761012\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\ \ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\ \ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \ \ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n\ \ \"acc_stderr\": 0.011884488905895555,\n \"acc_norm\": 0.8735632183908046,\n\ \ \"acc_norm_stderr\": 0.011884488905895555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.021393961404363847,\n\ \ \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.021393961404363847\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\ \ \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.3564245810055866,\n\ \ \"acc_norm_stderr\": 0.016018239710513398\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514276,\n\ \ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514276\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\ \ \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n\ \ \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n\ \ \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5460992907801419,\n \"acc_stderr\": 0.02970045324729147,\n \ \ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.02970045324729147\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.546284224250326,\n\ \ \"acc_stderr\": 0.012715404841277748,\n \"acc_norm\": 0.546284224250326,\n\ \ \"acc_norm_stderr\": 0.012715404841277748\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n\ \ \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7859477124183006,\n \"acc_stderr\": 0.016593429662329035,\n \ \ \"acc_norm\": 0.7859477124183006,\n \"acc_norm_stderr\": 0.016593429662329035\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\ \ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\ \ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\ \ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\ \ 
\"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\ \ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \ \ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\ \ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\ \ \"mc1_stderr\": 0.014450846714123892,\n \"mc2\": 0.48607139277849154,\n\ \ \"mc2_stderr\": 0.01710096370379909\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5674822415153907,\n \"acc_stderr\": 0.013923911578623823\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2767247915087187,\n \ \ \"acc_stderr\": 0.012323047397959787\n }\n}\n```" repo_url: https://huggingface.co/rufjdk5480/gov-qna-ko-merged leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|arc:challenge|25_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-05T04-31-12.088602.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|gsm8k|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hellaswag|10_2024-01-05T04-31-12.088602.parquet' - split: 
latest path: - '**/details_harness|hellaswag|10_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-31-12.088602.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-31-12.088602.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-31-12.088602.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-31-12.088602.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-31-12.088602.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-31-12.088602.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-31-12.088602.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-31-12.088602.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|truthfulqa:mc|0_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-05T04-31-12.088602.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_05T04_31_12.088602 path: - '**/details_harness|winogrande|5_2024-01-05T04-31-12.088602.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-05T04-31-12.088602.parquet' - config_name: results data_files: - split: 
2024_01_05T04_31_12.088602
    path:
    - results_2024-01-05T04-31-12.088602.parquet
  - split: latest
    path:
    - results_2024-01-05T04-31-12.088602.parquet
---

# Dataset Card for Evaluation run of rufjdk5480/gov-qna-ko-merged

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [rufjdk5480/gov-qna-ko-merged](https://huggingface.co/rufjdk5480/gov-qna-ko-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rufjdk5480__gov-qna-ko-merged",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-05T04:31:12.088602](https://huggingface.co/datasets/open-llm-leaderboard/details_rufjdk5480__gov-qna-ko-merged/blob/main/results_2024-01-05T04-31-12.088602.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6966669201907558, "acc_stderr": 0.03014874135400622, "acc_norm": 0.7075970828086718, "acc_norm_stderr": 0.03073409651050876, "mc1": 0.2178702570379437, "mc1_stderr": 0.014450846714123892, "mc2": 0.48607139277849154, "mc2_stderr": 0.01710096370379909 }, "harness|arc:challenge|25": { "acc": 0.3506825938566553, "acc_stderr": 0.013944635930726089, "acc_norm": 0.39505119453924914, "acc_norm_stderr": 0.014285898292938165 }, "harness|hellaswag|10": { "acc": 0.33917546305516827, "acc_stderr": 0.004724619193427588, "acc_norm": 0.39055964947221666, "acc_norm_stderr": 0.004868787333436588 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6888888888888889, "acc_stderr": 0.039992628766177214, "acc_norm": 0.6888888888888889, "acc_norm_stderr": 0.039992628766177214 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8355263157894737, "acc_stderr": 0.03016753346863271, "acc_norm": 0.8355263157894737, "acc_norm_stderr": 0.03016753346863271 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720683, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7811320754716982, "acc_stderr": 0.0254478638251086, "acc_norm": 0.7811320754716982, "acc_norm_stderr": 0.0254478638251086 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8680555555555556, "acc_stderr": 0.02830096838204443, "acc_norm": 0.8680555555555556, "acc_norm_stderr": 0.02830096838204443 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 
0.04902071300001974 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6994219653179191, "acc_stderr": 0.034961014811911786, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.034961014811911786 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5, "acc_stderr": 0.04975185951049946, "acc_norm": 0.5, "acc_norm_stderr": 0.04975185951049946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6680851063829787, "acc_stderr": 0.030783736757745636, "acc_norm": 0.6680851063829787, "acc_norm_stderr": 0.030783736757745636 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6403508771929824, "acc_stderr": 0.04514496132873633, "acc_norm": 0.6403508771929824, "acc_norm_stderr": 0.04514496132873633 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6896551724137931, "acc_stderr": 0.038552896163789485, "acc_norm": 0.6896551724137931, "acc_norm_stderr": 0.038552896163789485 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47883597883597884, "acc_stderr": 0.025728230952130726, "acc_norm": 0.47883597883597884, "acc_norm_stderr": 0.025728230952130726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5238095238095238, "acc_stderr": 0.04467062628403273, "acc_norm": 0.5238095238095238, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8451612903225807, "acc_stderr": 0.020579287326583227, "acc_norm": 0.8451612903225807, "acc_norm_stderr": 0.020579287326583227 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.6354679802955665, "acc_stderr": 0.0338640574606209, "acc_norm": 0.6354679802955665, "acc_norm_stderr": 0.0338640574606209 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8121212121212121, "acc_stderr": 0.03050193405942914, "acc_norm": 0.8121212121212121, "acc_norm_stderr": 0.03050193405942914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8737373737373737, "acc_stderr": 0.023664359402880236, "acc_norm": 0.8737373737373737, "acc_norm_stderr": 0.023664359402880236 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9378238341968912, "acc_stderr": 0.017426974154240524, "acc_norm": 0.9378238341968912, "acc_norm_stderr": 0.017426974154240524 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7076923076923077, "acc_stderr": 0.023060438380857726, "acc_norm": 0.7076923076923077, "acc_norm_stderr": 0.023060438380857726 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.02944316932303154, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.02944316932303154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7899159663865546, "acc_stderr": 0.026461398717471874, "acc_norm": 0.7899159663865546, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4966887417218543, "acc_stderr": 0.04082393379449654, "acc_norm": 0.4966887417218543, "acc_norm_stderr": 0.04082393379449654 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8844036697247707, "acc_stderr": 0.01370874953417264, "acc_norm": 0.8844036697247707, "acc_norm_stderr": 0.01370874953417264 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6296296296296297, "acc_stderr": 0.03293377139415191, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 
0.03293377139415191 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8627450980392157, "acc_stderr": 0.024152225962801588, "acc_norm": 0.8627450980392157, "acc_norm_stderr": 0.024152225962801588 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8818565400843882, "acc_stderr": 0.021011052659878453, "acc_norm": 0.8818565400843882, "acc_norm_stderr": 0.021011052659878453 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7713004484304933, "acc_stderr": 0.028188240046929203, "acc_norm": 0.7713004484304933, "acc_norm_stderr": 0.028188240046929203 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8244274809160306, "acc_stderr": 0.03336820338476074, "acc_norm": 0.8244274809160306, "acc_norm_stderr": 0.03336820338476074 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8512396694214877, "acc_stderr": 0.03248470083807194, "acc_norm": 0.8512396694214877, "acc_norm_stderr": 0.03248470083807194 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5982142857142857, "acc_stderr": 0.04653333146973647, "acc_norm": 0.5982142857142857, "acc_norm_stderr": 0.04653333146973647 }, "harness|hendrycksTest-management|5": { "acc": 0.8640776699029126, "acc_stderr": 0.03393295729761012, "acc_norm": 0.8640776699029126, "acc_norm_stderr": 0.03393295729761012 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.017893784904018533, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.017893784904018533 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.78, "acc_stderr": 0.041633319989322626, "acc_norm": 0.78, "acc_norm_stderr": 
0.041633319989322626 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8735632183908046, "acc_stderr": 0.011884488905895555, "acc_norm": 0.8735632183908046, "acc_norm_stderr": 0.011884488905895555 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8034682080924855, "acc_stderr": 0.021393961404363847, "acc_norm": 0.8034682080924855, "acc_norm_stderr": 0.021393961404363847 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3564245810055866, "acc_stderr": 0.016018239710513398, "acc_norm": 0.3564245810055866, "acc_norm_stderr": 0.016018239710513398 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.826797385620915, "acc_stderr": 0.021668400256514276, "acc_norm": 0.826797385620915, "acc_norm_stderr": 0.021668400256514276 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.797427652733119, "acc_stderr": 0.022827317491059686, "acc_norm": 0.797427652733119, "acc_norm_stderr": 0.022827317491059686 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.845679012345679, "acc_stderr": 0.020100830999850994, "acc_norm": 0.845679012345679, "acc_norm_stderr": 0.020100830999850994 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5460992907801419, "acc_stderr": 0.02970045324729147, "acc_norm": 0.5460992907801419, "acc_norm_stderr": 0.02970045324729147 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.546284224250326, "acc_stderr": 0.012715404841277748, "acc_norm": 0.546284224250326, "acc_norm_stderr": 0.012715404841277748 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7977941176470589, "acc_stderr": 0.024398192986654924, "acc_norm": 0.7977941176470589, "acc_norm_stderr": 0.024398192986654924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7859477124183006, "acc_stderr": 0.016593429662329035, "acc_norm": 0.7859477124183006, "acc_norm_stderr": 0.016593429662329035 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.04309118709946458, "acc_norm": 0.7181818181818181, 
"acc_norm_stderr": 0.04309118709946458 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8040816326530612, "acc_stderr": 0.025409301953225678, "acc_norm": 0.8040816326530612, "acc_norm_stderr": 0.025409301953225678 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101706, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466125, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466125 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8830409356725146, "acc_stderr": 0.024648068961366152, "acc_norm": 0.8830409356725146, "acc_norm_stderr": 0.024648068961366152 }, "harness|truthfulqa:mc|0": { "mc1": 0.2178702570379437, "mc1_stderr": 0.014450846714123892, "mc2": 0.48607139277849154, "mc2_stderr": 0.01710096370379909 }, "harness|winogrande|5": { "acc": 0.5674822415153907, "acc_stderr": 0.013923911578623823 }, "harness|gsm8k|5": { "acc": 0.2767247915087187, "acc_stderr": 0.012323047397959787 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
IndonesiaAI/sft-dataset
--- dataset_info: features: - name: qid dtype: string - name: messages list: - name: content dtype: string - name: role dtype: string splits: - name: train num_bytes: 7696896445.5759535 num_examples: 3798835 - name: test num_bytes: 855211166.424046 num_examples: 422093 download_size: 4782631024 dataset_size: 8552107612.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-4000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 999403 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
Ahmed167/floor-plans-dataset
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 1790707.0 num_examples: 31 download_size: 1747568 dataset_size: 1790707.0 --- # Dataset Card for "floor-plans-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
NLPC-UOM/Travel-Dataset-5000
---
language:
- en
license:
- mit
---

This question base consists of 5000 travel-domain questions annotated under a taxonomy related to the travel domain. The taxonomy is hierarchical, with two levels: 7 coarse classes and 63 fine classes. The file 5000TravelQuestionsDataset.xlsx contains the annotated question base together with the taxonomy; for the question base alone, use the 5000TravelQuestionsDataset.csv file.

If you use this dataset in your research work, cite it as:

Kahaduwa, H., Pathirana, D., Arachchi, P.L., Dias, V., Ranathunga, S. and Kohomban, U., 2017, May. Question Answering system for the travel domain. In Engineering Research Conference (MERCon), 2017 Moratuwa (pp. 449-454). IEEE.

If you need more clarification, please contact the following email addresses. Pathum - pathum.12@cse.mrt.ac.lk Dilshan - pathirana.12@cse.mrt.ac.lk Hasangi - hasangik.12@cse.mrt.ac.lk Vishma - vishma.12@cse.mrt.ac.lk
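Once the CSV is loaded, the two-level taxonomy (7 coarse classes, 63 fine classes) can be explored by grouping questions per class. The sketch below is only an illustration on a tiny in-memory sample with *hypothetical* column names (`question`, `coarse_class`, `fine_class`) — the actual column layout of 5000TravelQuestionsDataset.csv may differ.

```python
import csv
import io
from collections import Counter

# Hypothetical sample in an assumed layout; the real column names in
# 5000TravelQuestionsDataset.csv may differ.
sample_csv = """question,coarse_class,fine_class
What is the cheapest flight to Rome?,Transport,Flights
Which hotels in Paris have a pool?,Accommodation,Hotels
How do I get a visa for Japan?,Documents,Visa
Is there a train from Berlin to Prague?,Transport,Trains
"""

def count_classes(csv_text):
    """Count questions per coarse class and per (coarse, fine) pair."""
    reader = csv.DictReader(io.StringIO(csv_text))
    coarse, fine = Counter(), Counter()
    for row in reader:
        coarse[row["coarse_class"]] += 1
        # Fine classes are nested under coarse classes in the taxonomy,
        # so they are keyed by the (coarse, fine) pair.
        fine[(row["coarse_class"], row["fine_class"])] += 1
    return coarse, fine

coarse, fine = count_classes(sample_csv)
print(coarse["Transport"])             # 2
print(fine[("Transport", "Flights")])  # 1
```

Keying fine classes by the (coarse, fine) pair keeps the hierarchy explicit even if two coarse classes were to reuse a fine-class name.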
ZurabDz/bart_tokenized_data_bpe_byte_level
--- dataset_info: features: - name: input_ids sequence: int32 splits: - name: train num_bytes: 704227420 num_examples: 2708567 download_size: 437699986 dataset_size: 704227420 --- # Dataset Card for "bart_tokenized_data_bpe_byte_level" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
csebuetnlp/illusionVQA-Soft-Localization
---
language:
- en
size_categories:
- n<1K
task_categories:
- image-to-text
- visual-question-answering
dataset_info:
  features:
  - name: image
    dtype: image
  - name: question
    dtype: string
  - name: options
    sequence: string
  - name: answer
    dtype: string
  - name: category
    dtype: string
  - name: reasoning
    dtype: string
  - name: id
    dtype: int64
  splits:
  - name: train
    num_bytes: 50339
    num_examples: 4
  - name: test
    num_bytes: 24579079
    num_examples: 1000
  download_size: 24495650
  dataset_size: 24629418
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
license: cc-by-nc-sa-4.0
---

# IllusionVQA: Optical Illusion Dataset

[Project Page](https://illusionvqa.github.io/) | [Paper](https://arxiv.org/abs/2403.15952) | [Github](https://github.com/csebuetnlp/IllusionVQA/)

## TL;DR

IllusionVQA is a dataset of optical illusions and hard-to-interpret scenes designed to test the capability of Vision Language Models in comprehension and soft localization tasks. GPT4V achieved 62.99% accuracy on comprehension and 49.7% on localization, while humans achieved 91.03% and 100% respectively.

## Usage

```python
from datasets import load_dataset
import base64
from openai import OpenAI
import os

os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"

def encode_image(pil_image):
    temp_name = "temp.jpg"
    pil_image.save(temp_name)
    with open(temp_name, "rb") as image_file:
        return base64.b64encode(image_file.read()).decode("utf-8")

def construct_mcq(options, correct_option):
    correct_option_letter = None
    i = "a"
    mcq = ""
    for option in options:
        if option == correct_option:
            correct_option_letter = i
        mcq += f"{i}. {option}\n"
        i = chr(ord(i) + 1)
    mcq = mcq[:-1]
    return mcq, correct_option_letter

def add_row(content, data, i, with_answer=False):
    mcq, correct_option_letter = construct_mcq(data["options"], data["answer"])
    content.append({
        "type": "text",
        "text": "Image " + str(i) + ": " + data["question"] + "\n" + mcq
    })
    content.append({
        "type": "image_url",
        "image_url": {
            "url": f"data:image/jpeg;base64,{encode_image(data['image'])}",
            "detail": "low"
        }
    })
    if with_answer:
        content.append({"type": "text", "text": "Answer {}: ".format(i) + correct_option_letter})
    else:
        content.append({"type": "text", "text": "Answer {}: ".format(i)})
    return content

dataset = load_dataset("csebuetnlp/illusionVQA-Soft-Localization")
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

content = [{
    "type": "text",
    "text": "You'll be given an image, an instruction and some choices. You have to select the correct one. Do not explain your reasoning. Answer with the option's letter from the given choices directly. Here are a few examples:",
}]

### Add a few examples
for i, data in enumerate(dataset["train"], 1):
    content = add_row(content, data, i, with_answer=True)

content.append({"type": "text", "text": "Now you try it!"})
next_idx = i + 1

### Add the test data
test_data = dataset["test"][0]
content_t = add_row(content.copy(), test_data, next_idx, with_answer=False)

### Get the answer from GPT-4
response = client.chat.completions.create(
    model="gpt-4-vision-preview",
    messages=[{"role": "user", "content": content_t}],
    max_tokens=5,
)
gpt4_answer = response.choices[0].message.content
print(gpt4_answer)
```

## License

This dataset is made available for non-commercial research purposes only under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/). The dataset may not be used for training models. The dataset contains images collected from the internet.
While permission has been obtained from some of the images' creators, permission has not yet been received from all creators. If you believe any image in this dataset is used without proper permission and you are the copyright holder, please email <a href="mailto:sameen2080@gmail.com">Haz Sameen Shahgir</a> to request the removal of the image from the dataset. The dataset creator makes no representations or warranties regarding the copyright status of the images in the dataset. The dataset creator shall not be held liable for any unauthorized use of copyrighted material that may be contained in the dataset. You agree to the terms and conditions specified in this license by downloading or using this dataset. If you do not agree with these terms, do not download or use the dataset. <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" /></a> ### Citation ``` @article{shahgir2024illusionvqa, title={IllusionVQA: A Challenging Optical Illusion Dataset for Vision Language Models}, author={Haz Sameen Shahgir and Khondker Salman Sayeed and Abhik Bhattacharjee and Wasi Uddin Ahmad and Yue Dong and Rifat Shahriyar}, year={2024}, url={https://arxiv.org/abs/2403.15952}, } ```
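The option-lettering logic used by the `construct_mcq` helper in the usage example above can be exercised in isolation. This is a minimal standalone re-implementation for illustration, not part of the released evaluation code.

```python
def construct_mcq(options, correct_option):
    """Build an 'a. ... / b. ...' choice list and return the correct letter."""
    correct_option_letter = None
    letter = "a"
    mcq = ""
    for option in options:
        if option == correct_option:
            correct_option_letter = letter
        mcq += f"{letter}. {option}\n"
        letter = chr(ord(letter) + 1)
    return mcq[:-1], correct_option_letter  # strip the trailing newline

mcq, letter = construct_mcq(["circle", "square", "triangle"], "square")
print(mcq)
# a. circle
# b. square
# c. triangle
print(letter)  # b
```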
nicolof88/spider_train_prompts
--- license: apache-2.0 ---
qgallouedec/prj_gia_dataset_metaworld_handle_pull_side_v2_1111
---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---

An imitation-learning dataset for the handle-pull-side-v2 environment, sampled from the handle-pull-side-v2 policy. This dataset was created as part of the Generally Intelligent Agents project, gia: https://github.com/huggingface/gia

## Load dataset

First, clone it with

```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_handle_pull_side_v2_1111
```

Then, load it with

```python
import numpy as np

dataset = np.load("prj_gia_dataset_metaworld_handle_pull_side_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys())
# dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
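The loaded dictionary stores all transitions as flat arrays, so episodes can be recovered by cutting wherever `dones` is true. The sketch below demonstrates this on a small synthetic stand-in with the same keys (plain lists here; the real file holds NumPy arrays, but the logic is identical).

```python
# Synthetic stand-in with the same keys as dataset.npy (real values are
# NumPy arrays; plain lists are used here to keep the sketch self-contained).
dataset = {
    "observations": [0, 1, 2, 3, 4, 5],
    "actions": [0, 1, 2, 3, 4, 5],
    "rewards": [1.0, 0.5, 2.0, 1.0, 0.0, 3.0],
    "dones": [False, False, True, False, False, True],
}

def episode_returns(data):
    """Sum rewards between episode boundaries marked by `dones`."""
    returns, total = [], 0.0
    for reward, done in zip(data["rewards"], data["dones"]):
        total += reward
        if done:
            returns.append(total)
            total = 0.0
    return returns

print(episode_returns(dataset))  # [3.5, 4.0]
```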
open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch
--- pretty_name: Evaluation run of TFLai/Luban-Platypus2-13B-QLora-0.80-epoch dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TFLai/Luban-Platypus2-13B-QLora-0.80-epoch](https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-22T15:34:59.483409](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch/blob/main/results_2023-10-22T15-34-59.483409.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.022546140939597316,\n\ \ \"em_stderr\": 0.0015202810875087171,\n \"f1\": 0.12950398489932863,\n\ \ \"f1_stderr\": 0.002336366110485991,\n \"acc\": 0.3814234073910959,\n\ \ \"acc_stderr\": 0.0073618459091066004\n },\n \"harness|drop|3\":\ \ {\n \"em\": 0.022546140939597316,\n \"em_stderr\": 0.0015202810875087171,\n\ \ \"f1\": 0.12950398489932863,\n \"f1_stderr\": 0.002336366110485991\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \ \ \"acc_stderr\": 0.002615326510775673\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437528\n\ \ }\n}\n```" repo_url: https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|arc:challenge|25_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-30T01:02:30.667173.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_22T15_34_59.483409 path: - '**/details_harness|drop|3_2023-10-22T15-34-59.483409.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-22T15-34-59.483409.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_22T15_34_59.483409 path: - '**/details_harness|gsm8k|5_2023-10-22T15-34-59.483409.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-22T15-34-59.483409.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hellaswag|10_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_30T01_02_30.667173 path: - '**/details_harness|truthfulqa:mc|0_2023-08-30T01:02:30.667173.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-30T01:02:30.667173.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_22T15_34_59.483409 path: - '**/details_harness|winogrande|5_2023-10-22T15-34-59.483409.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-22T15-34-59.483409.parquet' - config_name: results data_files: - split: 2023_08_30T01_02_30.667173 path: - results_2023-08-30T01:02:30.667173.parquet - split: 2023_10_22T15_34_59.483409 path: - results_2023-10-22T15-34-59.483409.parquet - split: latest path: - results_2023-10-22T15-34-59.483409.parquet
---

# Dataset Card for Evaluation run of TFLai/Luban-Platypus2-13B-QLora-0.80-epoch

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [TFLai/Luban-Platypus2-13B-QLora-0.80-epoch](https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T15:34:59.483409](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch/blob/main/results_2023-10-22T15-34-59.483409.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.022546140939597316,
        "em_stderr": 0.0015202810875087171,
        "f1": 0.12950398489932863,
        "f1_stderr": 0.002336366110485991,
        "acc": 0.3814234073910959,
        "acc_stderr": 0.0073618459091066004
    },
    "harness|drop|3": {
        "em": 0.022546140939597316,
        "em_stderr": 0.0015202810875087171,
        "f1": 0.12950398489932863,
        "f1_stderr": 0.002336366110485991
    },
    "harness|gsm8k|5": {
        "acc": 0.009097801364670205,
        "acc_stderr": 0.002615326510775673
    },
    "harness|winogrande|5": {
        "acc": 0.7537490134175217,
        "acc_stderr": 0.012108365307437528
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
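The run splits in the configurations above are named with underscore-separated timestamps (the two runs in this repository are `2023_08_30T01_02_30.667173` and `2023_10_22T15_34_59.483409`). A minimal sketch of ordering such split names chronologically; the helper name `parse_run_split` is illustrative, not part of the repository:

```python
from datetime import datetime

def parse_run_split(name: str) -> datetime:
    # Split names such as "2023_10_22T15_34_59.483409" use "_" where an
    # ISO-8601 timestamp would use "-" and ":".
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

# The two run splits present in this repository's configurations:
runs = ["2023_08_30T01_02_30.667173", "2023_10_22T15_34_59.483409"]

# The most recent run is the one mirrored by the "latest" split.
latest = max(runs, key=parse_run_split)
```

This is only a convenience for comparing run timestamps locally; loading the `latest` split directly, as shown above, does not require it.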
Seanxh/twitter_dataset_1713086345
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 23637
    num_examples: 53
  download_size: 13242
  dataset_size: 23637
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
ZHLiu627/ultrafeedback_binarized_with_response_full_part0
---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: prompt_id
    dtype: string
  - name: chosen
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: rejected
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: messages
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: score_chosen
    dtype: float64
  - name: score_rejected
    dtype: float64
  - name: reference_response
    dtype: string
  splits:
  - name: train_prefs
    num_bytes: 165761185
    num_examples: 20000
  download_size: 92065089
  dataset_size: 165761185
configs:
- config_name: default
  data_files:
  - split: train_prefs
    path: data/train_prefs-*
---

# Dataset Card for "ultrafeedback_binarized_with_response_full_part0"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
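The `chosen`, `rejected`, and `messages` features declared in the metadata above are lists of chat messages with `content` and `role` fields. A hypothetical record shaped like that schema; every value below is invented purely for illustration:

```python
# Hypothetical train_prefs record matching the declared feature schema;
# the prompt, responses, and scores are invented for illustration only.
example = {
    "prompt": "What is the capital of France?",
    "prompt_id": "example-id",
    "chosen": [
        {"content": "What is the capital of France?", "role": "user"},
        {"content": "The capital of France is Paris.", "role": "assistant"},
    ],
    "rejected": [
        {"content": "What is the capital of France?", "role": "user"},
        {"content": "I am not sure.", "role": "assistant"},
    ],
    "messages": [
        {"content": "What is the capital of France?", "role": "user"},
        {"content": "The capital of France is Paris.", "role": "assistant"},
    ],
    "score_chosen": 9.0,
    "score_rejected": 4.0,
    "reference_response": "The capital of France is Paris.",
}
```

In a binarized preference dataset of this shape, `score_chosen` is expected to exceed `score_rejected`, and each conversation ends with an assistant turn.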
DONG19/CoT_instruction_20k
---
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: output
    dtype: string
  - name: input
    dtype: string
  splits:
  - name: train
    num_bytes: 6132650
    num_examples: 20022
  download_size: 0
  dataset_size: 6132650
---

# Dataset Card for "CoT_instruction_20k"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Delcos__Velara
--- pretty_name: Evaluation run of Delcos/Velara dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Delcos/Velara](https://huggingface.co/Delcos/Velara) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Delcos__Velara\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-08T00:16:45.141900](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Velara/blob/main/results_2023-12-08T00-16-45.141900.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5941805681088884,\n\ \ \"acc_stderr\": 0.03328213036591988,\n \"acc_norm\": 0.5983564094269671,\n\ \ \"acc_norm_stderr\": 0.03395331581770101,\n \"mc1\": 0.2778457772337821,\n\ \ \"mc1_stderr\": 0.015680929364024637,\n \"mc2\": 0.44699355725588724,\n\ \ \"mc2_stderr\": 0.015255919110214552\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\ \ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642664\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6477793268273252,\n\ \ \"acc_stderr\": 0.004766860907171539,\n \"acc_norm\": 0.8283210515833499,\n\ \ \"acc_norm_stderr\": 0.00376330474609875\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\ \ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\ \ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\ \ \"acc_stderr\": 0.03899073687357334,\n \"acc_norm\": 0.6805555555555556,\n\ \ \"acc_norm_stderr\": 0.03899073687357334\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\ \ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\ \ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\ \ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\ \ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\ \ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n\ \ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\ \ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400513,\n \"\ acc_norm\": 0.4365079365079365,\n 
\"acc_norm_stderr\": 0.025542846817400513\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\ \ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\ \ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\ \ \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n\ \ \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\ \ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\ \ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\ acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.02504919787604234,\n \ \ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.02504919787604234\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\ : {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\ \ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\ acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.763302752293578,\n \"acc_stderr\": 0.0182240781172991,\n \"acc_norm\"\ : 0.763302752293578,\n \"acc_norm_stderr\": 0.0182240781172991\n },\n\ \ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n\ \ \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n\ \ \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n\ \ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808517,\n \ \ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808517\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\ \ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7933884297520661,\n 
\"acc_stderr\": 0.03695980128098824,\n \"\ acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\ \ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\ \ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\ \ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\ \ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\ \ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\ \ \"acc_stderr\": 0.014805384478371151,\n \"acc_norm\": 0.7803320561941252,\n\ \ \"acc_norm_stderr\": 0.014805384478371151\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\ \ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\ \ \"acc_stderr\": 0.0142426300705749,\n \"acc_norm\": 0.23798882681564246,\n\ \ \"acc_norm_stderr\": 0.0142426300705749\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388856,\n\ \ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388856\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\ \ \"acc_stderr\": 0.02685882587948853,\n \"acc_norm\": 0.662379421221865,\n\ \ \"acc_norm_stderr\": 0.02685882587948853\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\ \ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829707,\n \ \ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829707\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\ \ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\ \ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\ \ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6209150326797386,\n \"acc_stderr\": 0.01962744474841223,\n \ \ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.01962744474841223\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\ \ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\ \ 
\"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\ \ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \ \ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\ \ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\ \ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\ \ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\ \ \"mc1_stderr\": 0.015680929364024637,\n \"mc2\": 0.44699355725588724,\n\ \ \"mc2_stderr\": 0.015255919110214552\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637563\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40333586050037906,\n \ \ \"acc_stderr\": 0.013512654781814687\n }\n}\n```" repo_url: https://huggingface.co/Delcos/Velara leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|arc:challenge|25_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-08T00-16-45.141900.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|gsm8k|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hellaswag|10_2023-12-08T00-16-45.141900.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T00-16-45.141900.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|truthfulqa:mc|0_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-08T00-16-45.141900.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_08T00_16_45.141900 path: - '**/details_harness|winogrande|5_2023-12-08T00-16-45.141900.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-08T00-16-45.141900.parquet' - config_name: results data_files: - split: 
2023_12_08T00_16_45.141900 path: - results_2023-12-08T00-16-45.141900.parquet - split: latest path: - results_2023-12-08T00-16-45.141900.parquet
---
# Dataset Card for Evaluation run of Delcos/Velara

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Delcos/Velara
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Delcos/Velara](https://huggingface.co/Delcos/Velara) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Delcos__Velara",
    "harness_winogrande_5",
    split="train",
)
```

## Latest results

These are the [latest results from run 2023-12-08T00:16:45.141900](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Velara/blob/main/results_2023-12-08T00-16-45.141900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5941805681088884, "acc_stderr": 0.03328213036591988, "acc_norm": 0.5983564094269671, "acc_norm_stderr": 0.03395331581770101, "mc1": 0.2778457772337821, "mc1_stderr": 0.015680929364024637, "mc2": 0.44699355725588724, "mc2_stderr": 0.015255919110214552 }, "harness|arc:challenge|25": { "acc": 0.5708191126279863, "acc_stderr": 0.014464085894870653, "acc_norm": 0.5895904436860068, "acc_norm_stderr": 0.014374922192642664 }, "harness|hellaswag|10": { "acc": 0.6477793268273252, "acc_stderr": 0.004766860907171539, "acc_norm": 0.8283210515833499, "acc_norm_stderr": 0.00376330474609875 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849724, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849724 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6075471698113207, "acc_stderr": 0.030052580579557845, "acc_norm": 0.6075471698113207, "acc_norm_stderr": 0.030052580579557845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.03899073687357334, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.03899073687357334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283647, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283647 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.032650194750335815, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.04657047260594964, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.04657047260594964 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4365079365079365, "acc_stderr": 0.025542846817400513, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.025542846817400513 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768177, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768177 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6806451612903226, "acc_stderr": 0.026522709674667768, "acc_norm": 0.6806451612903226, "acc_norm_stderr": 0.026522709674667768 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4630541871921182, "acc_stderr": 0.035083705204426656, "acc_norm": 0.4630541871921182, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7424242424242424, "acc_stderr": 0.031156269519646836, "acc_norm": 0.7424242424242424, "acc_norm_stderr": 0.031156269519646836 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306433, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5769230769230769, "acc_stderr": 0.02504919787604234, "acc_norm": 0.5769230769230769, "acc_norm_stderr": 0.02504919787604234 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228402, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228402 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6176470588235294, "acc_stderr": 0.031566630992154156, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.031566630992154156 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389024, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.763302752293578, "acc_stderr": 0.0182240781172991, "acc_norm": 0.763302752293578, "acc_norm_stderr": 0.0182240781172991 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4305555555555556, "acc_stderr": 0.03376922151252336, "acc_norm": 
0.4305555555555556, "acc_norm_stderr": 0.03376922151252336 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.029331162294251735, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.029331162294251735 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808517, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808517 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.648854961832061, "acc_stderr": 0.04186445163013751, "acc_norm": 0.648854961832061, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094632, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094632 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.025140935950335445, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.025140935950335445 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, 
"acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.014805384478371151, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.014805384478371151 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6560693641618497, "acc_stderr": 0.025574123786546665, "acc_norm": 0.6560693641618497, "acc_norm_stderr": 0.025574123786546665 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.0142426300705749, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.0142426300705749 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6601307189542484, "acc_stderr": 0.027121956071388856, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.027121956071388856 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.662379421221865, "acc_stderr": 0.02685882587948853, "acc_norm": 0.662379421221865, "acc_norm_stderr": 0.02685882587948853 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6759259259259259, "acc_stderr": 0.02604176620271716, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.02604176620271716 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.475177304964539, "acc_stderr": 0.029790719243829707, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.029790719243829707 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4511082138200782, "acc_stderr": 0.012709037347346233, "acc_norm": 0.4511082138200782, "acc_norm_stderr": 0.012709037347346233 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5919117647058824, "acc_stderr": 0.029855261393483924, "acc_norm": 0.5919117647058824, "acc_norm_stderr": 0.029855261393483924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6209150326797386, "acc_stderr": 0.01962744474841223, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.01962744474841223 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 
0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6857142857142857, "acc_stderr": 0.029719329422417475, "acc_norm": 0.6857142857142857, "acc_norm_stderr": 0.029719329422417475 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7910447761194029, "acc_stderr": 0.028748298931728655, "acc_norm": 0.7910447761194029, "acc_norm_stderr": 0.028748298931728655 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835816, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835816 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7426900584795322, "acc_stderr": 0.03352799844161865, "acc_norm": 0.7426900584795322, "acc_norm_stderr": 0.03352799844161865 }, "harness|truthfulqa:mc|0": { "mc1": 0.2778457772337821, "mc1_stderr": 0.015680929364024637, "mc2": 0.44699355725588724, "mc2_stderr": 0.015255919110214552 }, "harness|winogrande|5": { "acc": 0.7379636937647988, "acc_stderr": 0.012358944431637563 }, "harness|gsm8k|5": { "acc": 0.40333586050037906, "acc_stderr": 0.013512654781814687 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
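The `"all"` block in the results shown earlier in this card is an unweighted average over the per-task scores. As a minimal illustrative sketch (not the leaderboard's actual aggregation code), the mean accuracy over a toy subset of those tasks can be computed like this:

```python
# Illustrative sketch only: average per-task accuracy the way the "all"
# block aggregates it. The two values below are copied from the results
# above; using just two tasks is a simplification, not the real 63-task run.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-college_computer_science|5": {"acc": 0.5},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 3))  # prints 0.405
```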
DeepFoldProtein/foldseek_over70_ss_proteome_processed_1024_ankh_sst8_70_test
--- dataset_info: features: - name: uniprotAccession dtype: string - name: chain_id dtype: string - name: seq dtype: string - name: sst3 dtype: string - name: sst8 dtype: string - name: len dtype: int64 - name: confidenceScore sequence: float64 - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: special_tokens_mask sequence: int8 - name: label sequence: int64 - name: loss_mask sequence: int64 splits: - name: train num_bytes: 1915136 num_examples: 99 download_size: 128542 dataset_size: 1915136 configs: - config_name: default data_files: - split: train path: data/train-* ---
distilled-from-one-sec-cv12/chunk_252
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 896545004 num_examples: 174697 download_size: 915076502 dataset_size: 896545004 --- # Dataset Card for "chunk_252" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Abijith/car-damage-segmentation-small
--- license: apache-2.0 --- # Car damage segmentation sample dataset
Nerfgun3/brush_style
---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Brush Style Embedding / Textual Inversion

## Usage

To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.

To use it in a prompt: ```"art by brush_style"```

If it is too strong, just add [] around it.

Trained until 10000 steps. A version trained for 7.5k steps is included in the files as well. If you want to use that version, remove the ```"-7500"``` from the file name and replace the 10k-step version in your folder.

Have fun :)

## Example Pictures

<table>
  <tr>
    <td><img src=https://i.imgur.com/Mp2F6GR.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/a2Cmqb4.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/YwSafu4.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/fCFSIs5.png width=100% height=100%/></td>
    <td><img src=https://i.imgur.com/S8v6sXG.png width=100% height=100%/></td>
  </tr>
</table>

## License

This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies:

1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)

[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
abdoelsayed/Open-ArabicaQA
---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- found
license: mit
task_categories:
- question-answering
language:
- ar
pretty_name: abdoelsayed/Open-ArabicaQA
size_categories:
- 10K<n<100K
---
# ArabicaQA

ArabicaQA: Comprehensive Dataset for Arabic Question Answering

This repository contains the dataset for the paper *ArabicaQA: Comprehensive Dataset for Arabic Question Answering*. Below, we provide details regarding the materials available in this repository.

ArabicaQA is a robust dataset designed to support and advance the development of Arabic Question Answering (QA) systems. This dataset encompasses a wide range of question types, including both Machine Reading Comprehension (MRC) and Open-Domain questions, catering to various aspects of QA research and application. The dataset is structured to facilitate training, validation, and testing of Arabic QA models.

For more details, see https://github.com/DataScienceUIBK/ArabicaQA/tree/main

## Dataset

Within this folder, you will find the training, validation, and test sets of the ArabicaQA dataset. Refer to the table below for the dataset statistics:

|                    | Training | Validation | Test   |
| -------------------|----------|------------|--------|
| MRC (with answers) | 62,186   | 13,483     | 13,426 |
| MRC (unanswerable) | 2,596    | 561        | 544    |
| Open-Domain        | 62,057   | 13,475     | 13,414 |
| Open-Domain        | 58,528   | 12,541     | 12,541 |

## Citation

If you find this code or data useful, please consider citing our paper:

```
@misc{abdallah2024arabicaqa,
      title={ArabicaQA: A Comprehensive Dataset for Arabic Question Answering},
      author={Abdelrahman Abdallah and Mahmoud Kasem and Mahmoud Abdalla and Mohamed Mahmoud and Mohamed Elkasaby and Yasser Elbendary and Adam Jatowt},
      year={2024},
      eprint={2403.17848},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
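To make the MRC format concrete, here is a minimal sketch of consuming one SQuAD-style record; the field names and the example record are illustrative assumptions, not ArabicaQA's actual schema:

```python
# Hypothetical MRC-style record; the field names and content are
# illustrative assumptions, not ArabicaQA's actual column names.
record = {
    "question": "In what year was the university founded?",
    "context": "The university was founded in 1908 in Cairo.",
    "answer_text": "1908",
}

# Locate the gold span inside the context, the way SQuAD-style
# readers are typically trained and evaluated on character offsets.
start = record["context"].index(record["answer_text"])
end = start + len(record["answer_text"])
span = record["context"][start:end]
print(span)  # prints 1908
```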
Santosh-Gupta/EncephalitisParagraphEmbeddings
--- license: mit dataset_info: features: - name: paragraph_embeddings sequence: float32 splits: - name: train num_bytes: 1578707784 num_examples: 513234 download_size: 1897946417 dataset_size: 1578707784 configs: - config_name: default data_files: - split: train path: data/train-* ---
otavinshow/karlvoz
--- license: openrail ---
jlbaker361/addition_decimal
--- dataset_info: features: - name: input dtype: string - name: output dtype: float64 - name: text dtype: string splits: - name: train num_bytes: 2145709.8 num_examples: 29376 - name: test num_bytes: 238412.2 num_examples: 3264 download_size: 884683 dataset_size: 2384122.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* --- # Dataset Card for "addition_decimal" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
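Given the `input`/`output`/`text` features above, rows of this shape could plausibly be generated as follows; the exact prompt formatting is a guess, since the card does not document it:

```python
import random

def make_row(rng: random.Random) -> dict:
    """Build one decimal-addition example (format is an illustrative guess)."""
    a = round(rng.uniform(0, 100), 2)
    b = round(rng.uniform(0, 100), 2)
    total = round(a + b, 2)
    prompt = f"{a} + {b} ="
    # Mirrors the card's features: input (string), output (float64), text (string).
    return {"input": prompt, "output": total, "text": f"{prompt} {total}"}

row = make_row(random.Random(0))
print(row)
```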
SaffalPoosh/DM-vton-test
--- configs: - config_name: default data_files: - split: test path: data/test-* dataset_info: features: - name: person_images dtype: image - name: cloth_images dtype: image - name: cloth_edge_images dtype: image splits: - name: test num_bytes: 6878828.0 num_examples: 416 download_size: 6469560 dataset_size: 6878828.0 --- # Dataset Card for "DM-vton" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
thomasgauthier/observation_or_evaluation
---
task_categories:
- text-classification
tags:
- Synthetic
- Nonviolent communication
- Empathetic understanding
language:
- en
multilinguality:
- monolingual
language_creators:
- book
- tv_script
annotations_creators:
- machine-generated
size_categories:
- 1K<n<10K
license:
- apache-2.0
---
# Dataset Card for "Observation or evaluation"

## Dataset Description

- **Homepage:** [Blog post](https://thomasgauthier.dev/devlog/nvc)
- **Repository:** [Github](https://github.com/thomasgauthier/observation_or_evaluation)

### Dataset Summary

This dataset contains statements classified into observation and evaluation categories, based on the principles of Nonviolent Communication (NVC) taught by Marshall Rosenberg. It is a synthetic dataset, generated and augmented through various language models, of statements reflecting either pure observations (noticing) or evaluations (judgments), aimed at understanding and practicing effective empathetic communication. The dataset is constructed to evaluate generalist large language models' ability to distinguish between observational and evaluative sentences as defined in NVC, serving as a benchmark for sentiment analysis and subjective interpretation accuracy.

### Supported Tasks and Leaderboards

- `text-classification`: This task involves classifying sentences into one of two categories: statements containing observations and statements containing evaluations. This discerning ability can help in understanding and improving empathetic communication skills. There is no active leaderboard for this task, but this dataset can be used as a benchmark.

### Languages

The dataset is entirely in English.

## Dataset Structure

### Data Instances

A typical instance in this dataset might look like:

```json
{
  "statement": "John was angry with me yesterday for no reason.",
  "reasoning": "The statement suggests that John felt a certain emotion (anger) directed towards the speaker and adds the qualifier 'for no reason'.
This is considered an evaluation.",
  "classification": "Evaluation",
  "pure_observation_alternative": "John told me he was angry."
}
```

### Data Fields

- `statement`: The original sentence provided in the dataset.
- `reasoning`: The rationale behind classifying the statement as an observation, evaluation, or mixed.
- `classification`: The classification of the statement - `Observation`, `Evaluation`, or `Mixed`.
- `pure_observation_alternative`: An optional alternative version of the statement that represents a pure observation without evaluation.

### Data Splits

The dataset is provided as a single `test` split, intended not for training but as a benchmark to evaluate generalist models.

## Additional files

This repo also includes

- [`observation_or_evaluation.ipynb`](observation_or_evaluation.ipynb): The complete code for generating, filtering and refining the dataset
- [`generations.csv`](generations.csv): A CSV file with all the prompts and generations (with generation parameters) sent to and received from Together.ai
- [`results.sqlite`](results.sqlite): The sqlite file where everything was saved (see notebook)

## Dataset Creation

### Curation Rationale

The dataset was created to provide a metric to gauge language models' abilities at Nonviolent Communication (NVC), specifically the differentiation between observations and evaluations, which is a core concept in NVC.

### Source Data

All samples in this dataset were generated by large language models. The bulk of the data was inspired by an exercise in Marshall Rosenberg's book *Nonviolent Communication: A Language of Life*. It was further augmented with TV script (*Seinfeld*) seed data to ensure varied and relatable statements.

### Annotations

Annotations were generated by language model outputs, with subsequent manual and automated review and adjustment to ensure quality and adherence to NVC principles.
## Considerations for Using the Data

### Social Impact of Dataset

This dataset aims to contribute positively to the development of AI systems capable of understanding and practicing principles of empathetic and nonviolent communication, potentially reducing misunderstandings and conflicts in human interactions.

### Discussion of Biases

Given the synthetic nature of part of the dataset, there are most certainly biases in the language models' training data that could affect the classifications.

### Other Known Limitations

The synthetic generation of data points may not capture the full complexity and nuance of human emotional expression and interpretation. Furthermore, the classifications contained in the dataset have not been reviewed by NVC practitioners and could fail to properly reflect NVC principles.

## Additional Information

### Dataset Curators

The dataset generation pipeline was developed by Thomas Gauthier-Caron.

### Licensing Information

The dataset is distributed under an Apache 2.0 license.

### Citation Information

```
@misc{observation_or_evaluation_dataset_2024,
  author = {Gauthier-Caron, Thomas},
  title = {Observation or evaluation dataset},
  year = {2024},
  howpublished = {\url{https://thomasgauthier.dev/devlog/nvc}}
}
```

### Contributions

Special thanks to Marshall B. Rosenberg for the foundational work on Nonviolent Communication. Additional thanks to Mistral, Hugging Face, Together.AI and Nous Research for the AI models and inference services that enabled this work.
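As a sketch of how the `classification` field can back a simple benchmark score, here is a toy accuracy computation; the gold rows reuse the card's example instance, and the "predicted" labels are made up for illustration, not real model output:

```python
# Tiny illustration of scoring a model against the `classification` labels.
# The first gold row comes from the data instance above; the second row and
# all "predicted" values are invented for demonstration purposes.
examples = [
    {"statement": "John was angry with me yesterday for no reason.", "classification": "Evaluation"},
    {"statement": "John told me he was angry.", "classification": "Observation"},
]
predicted = ["Evaluation", "Evaluation"]

correct = sum(pred == ex["classification"] for pred, ex in zip(predicted, examples))
accuracy = correct / len(examples)
print(accuracy)  # prints 0.5
```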
ChanceFocus/m2sum
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: id dtype: string - name: query dtype: string - name: answer dtype: string - name: text dtype: string splits: - name: train num_bytes: 10278 num_examples: 1 - name: test num_bytes: 4679014 num_examples: 200 download_size: 0 dataset_size: 4689292 --- # Dataset Card for "m2sum" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/ringo_touhou
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of ringo/鈴瑚/링고 (Touhou)

This is the dataset of ringo/鈴瑚/링고 (Touhou), containing 500 images and their tags.

The core tags of this character are `animal_ears, rabbit_ears, blonde_hair, short_hair, hat, floppy_ears, red_eyes, flat_cap, brown_headwear, breasts, cabbie_hat`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                      | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 500    | 437.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ringo_touhou/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500    | 296.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ringo_touhou/resolve/main/dataset-800.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 1096   | 601.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ringo_touhou/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 500    | 406.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ringo_touhou/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 1096   | 782.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ringo_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ringo_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, dango, orange_shirt, solo, midriff, short_sleeves, shorts, skewer, eating, looking_at_viewer, navel, barefoot, smile | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, dango, holding_food, orange_shirt, short_sleeves, solo, striped_shorts, yellow_shorts, closed_mouth, midriff, navel, simple_background, eating, white_background, bangs, vertical_stripes, :t, barefoot, blush_stickers, frills, full_body, medium_breasts, crop_top, one-hour_drawing_challenge, yellow_shirt | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, orange_shirt, solo, upper_body, open_mouth, short_sleeves, looking_at_viewer, simple_background, bangs, smile, collarbone | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dango | orange_shirt | solo | midriff | short_sleeves | shorts | skewer | eating | looking_at_viewer | navel | barefoot | smile | holding_food | striped_shorts | yellow_shorts | closed_mouth | simple_background | 
white_background | bangs | vertical_stripes | :t | blush_stickers | frills | full_body | medium_breasts | crop_top | one-hour_drawing_challenge | yellow_shirt | upper_body | open_mouth | collarbone | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------|:-------|:----------|:----------------|:---------|:---------|:---------|:--------------------|:--------|:-----------|:--------|:---------------|:-----------------|:----------------|:---------------|:--------------------|:-------------------|:--------|:-------------------|:-----|:-----------------|:---------|:------------|:-----------------|:-----------|:-----------------------------|:---------------|:-------------|:-------------|:-------------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | X | | | | X | | | X | | | | | X | | X | | | | | | | | | | X | X | X |
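In the IMG+TXT packages listed above, each image is shipped with a same-named `.txt` sidecar holding its comma-separated tags. A minimal sketch for pairing them after extraction (the directory layout is assumed from the package description, and the demo filenames are hypothetical):

```python
import os
import tempfile

def load_tag_pairs(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with the
    comma-separated tags from its same-named .txt sidecar."""
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs[name] = tags
    return pairs

# demo with a stand-in directory instead of a real extracted package
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, '0001.png'), 'wb').close()
    with open(os.path.join(d, '0001.txt'), 'w', encoding='utf-8') as f:
        f.write('1girl, dango, orange_shirt, solo')
    print(load_tag_pairs(d))  # {'0001.png': ['1girl', 'dango', 'orange_shirt', 'solo']}
```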
23nx7ng/cdial-bias
---
license: cc-by-nc-nd-4.0
---

Data by Zhou Jingyan et al. (2022), copied from GitHub with permission. Original repo: https://github.com/para-zhou/CDial-Bias/.
daydreamer-json/temporalHlsRawDataStorage
--- viewer: false ---
coralexbadea/monitorul_trial_qa400
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 945166 num_examples: 3181 download_size: 441212 dataset_size: 945166 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "monitorul_trial_qa400" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
justinian336/salvadoran-news-ner
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 73983057.36422747 num_examples: 56025 download_size: 43634286 dataset_size: 73983057.36422747 --- # Dataset Card for "salvadoran-news-ner" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/nekone_azurlane
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of nekone/ネコネ/猫音 (Azur Lane)

This is the dataset of nekone/ネコネ/猫音 (Azur Lane), containing 46 images and their tags.

The core tags of this character are `animal_ears, red_eyes, brown_hair, tail, long_hair, black_hair, mole_under_eye, mole, ponytail, ribbon`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size      | Download                                                                                                         | Type       | Description                                                          |
|:-----------------|-------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 46     | 47.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nekone_azurlane/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 46     | 31.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nekone_azurlane/resolve/main/dataset-800.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 107    | 67.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nekone_azurlane/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 46     | 43.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nekone_azurlane/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 107    | 87.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nekone_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/nekone_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, blush, long_sleeves, simple_background, white_background, holding, dress, bangs, hair_ribbon, standing, twintails | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | long_sleeves | simple_background | white_background | holding | dress | bangs | hair_ribbon | standing | twintails | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:---------------|:--------------------|:-------------------|:----------|:--------|:--------|:--------------|:-----------|:------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
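Since the core tags listed at the top are pruned from the sidecars, a quick way to see which tags remain is to count tag frequencies across an extracted IMG+TXT package. A small sketch, assuming the sidecar layout described above (the demo files are stand-ins, not part of this repo's tooling):

```python
import glob
import os
import tempfile
from collections import Counter

def tag_frequencies(dataset_dir):
    """Count how often each tag appears across all .txt sidecars."""
    counts = Counter()
    for path in glob.glob(os.path.join(dataset_dir, '*.txt')):
        with open(path, encoding='utf-8') as f:
            counts.update(t.strip() for t in f.read().split(',') if t.strip())
    return counts

# demo with stand-in sidecar files instead of a real extracted package
with tempfile.TemporaryDirectory() as d:
    for name, tags in [('1.txt', '1girl, solo, dress'), ('2.txt', '1girl, blush')]:
        with open(os.path.join(d, name), 'w', encoding='utf-8') as f:
            f.write(tags)
    print(tag_frequencies(d).most_common(2))  # [('1girl', 2), ...]
```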
Vinnyyw/AnahiSA
--- license: openrail ---