| datasetId | card |
|---|---|
Upretiramesh/SyntheticECGData | ---
license: apache-2.0
---
|
liuyanchen1015/VALUE_stsb_negative_inversion | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 246
num_examples: 1
- name: train
num_bytes: 281
num_examples: 2
download_size: 6194
dataset_size: 527
---
# Dataset Card for "VALUE_stsb_negative_inversion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cognitive-Lab/Indic-Hellaswag | ---
configs:
- config_name: kn
data_files:
- split: test
path: kn/test.json
- split: validation
path: kn/validation.json
- config_name: hi
data_files:
- split: test
path: hi/test.json
- split: validation
path: hi/validation.json
- config_name: ta
data_files:
- split: test
path: ta/test.json
- split: validation
path: ta/validation.json
- config_name: te
data_files:
- split: test
path: te/test.json
- split: validation
path: te/validation.json
- config_name: ml
data_files:
- split: test
path: ml/test.json
- split: validation
path: ml/validation.json
- config_name: gu
data_files:
- split: test
path: gu/test.json
- split: validation
path: gu/validation.json
- config_name: mr
data_files:
- split: test
path: mr/test.json
- split: validation
path: mr/validation.json
---
# Hellaswag Translated
Citation:
```
@inproceedings{zellers2019hellaswag,
title={HellaSwag: Can a Machine Really Finish Your Sentence?},
author={Zellers, Rowan and Holtzman, Ari and Bisk, Yonatan and Farhadi, Ali and Choi, Yejin},
booktitle ={Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics},
year={2019}
}
```
Contributions:\
Thanks to [@Srinidhi9113](https://huggingface.co/Srinidhi9113) for adding the dataset. |
youndukn/dpo_10 | ---
license: apache-2.0
---
|
Bylaw/bp-template-classification | ---
dataset_info:
features:
- name: pixel_values
sequence:
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2058820896
num_examples: 3404
download_size: 129763033
dataset_size: 2058820896
---
# Dataset Card for "bp-template-classification-bp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo-15 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
factored/test_fr_explorer_dataset | ---
dataset_info:
features:
- name: labels
dtype:
class_label:
names:
'0': Data Engineer
'1': Machine Learning Engineer
'2': Data Analyst
'3': Senior Data Engineer
'4': Data Scientist
'5': BI Engineer
'6': Senior Business Intelligence Developer
'7': Full Stack Developer
'8': DevOps Engineer
'9': CTO and Back-End Developer
'10': Software Engineer
'11': BI Analyst
'12': Data Visualization and Business Intelligence Analyst
'13': Innovation and Software Development Manager
'14': Information Governance Analyst
'15': Operations Specialist
'16': Senior Machine Learning Engineer
'17': Data Scientist and Sr. Commercial Intelligence Analyst
'18': Business Intelligence Developer
'19': Pricing Analyst
'20': Senior Big Data Engineer
'21': Analytics Engineer
'22': Operations Maintenance Engineer
'23': Freelance Web Developer
'24': Data Ops/Data Engineer
'25': Master’s Course Designer
'26': Engineer
'27': Logistics Analyst
'28': Software Developer Engineer
'29': Senior Tutor
'30': Data Engineer & Architect
'31': IT Analyst
'32': Freelance Computer Vision Engineer
'33': Undergraduate Researcher
'34': Analytics Consultant
'35': Senior Business Intelligence
'36': Junior Backend Developer
'37': Junior Business Intelligence Consultant
'38': Senior Platform Engineer
'39': Senior Back-End Developer
'40': Teaching Assistant
'41': Data Scientist Research
'42': Sr. Software Data Engineer
'43': Industrial Improvement Intern
'44': Data Science Analyst
'45': Statistical Analyst
'46': Industry and Commerce Analyst
'47': Machine Learning Trainee
'48': Final Thesis Project
'49': Product Manager
'50': Software Developer
- name: masked_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 42827.0
num_examples: 120
download_size: 32966
dataset_size: 42827.0
---
# Dataset Card for "test_fr_explorer_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aranemini/RoboVox | ---
license: cc-by-4.0
---
RoboVox is a far-field speaker recognition dataset. |
Phando/vqa_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: image_id
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 67692137168.704
num_examples: 443757
- name: validation
num_bytes: 33693404566.41
num_examples: 214354
- name: test
num_bytes: 70169720510.0
num_examples: 447793
download_size: 34818002031
dataset_size: 171555262245.114
---
# Dataset Card for "vqa_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/alpaca-intel-data-dpo | ---
license: mit
---
|
income/cqadupstack-gaming-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# CQADupstack-Gaming: Top-20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the CQADupstack (gaming) dataset of the BEIR benchmark.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
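To illustrate how a per-passage generated-queries file is commonly consumed, the sketch below groups queries by their source document id. The JSONL layout and the field names `_id` and `query` are assumptions for illustration only, not the confirmed schema of this repository:

```python
import json

def group_queries(jsonl_lines):
    """Group generated queries by source document id.

    Assumes one JSON object per line with hypothetical fields
    "_id" (source document id) and "query" (one generated query).
    """
    grouped = {}
    for line in jsonl_lines:
        record = json.loads(line)
        grouped.setdefault(record["_id"], []).append(record["query"])
    return grouped

# Tiny illustrative sample (not real data from this dataset)
sample = [
    '{"_id": "doc1", "query": "how to fix screen tearing"}',
    '{"_id": "doc1", "query": "what causes screen tearing"}',
    '{"_id": "doc2", "query": "best budget gaming mouse"}',
]
queries_per_doc = group_queries(sample)
print(queries_per_doc["doc1"])  # two queries generated for doc1
```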
Below is the old dataset card for the BEIR benchmark.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
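The on-disk files described under Dataset Structure can be parsed with the Python standard library alone. A minimal sketch, assuming the `corpus.jsonl`, `queries.jsonl`, and `qrels.tsv` layout above (in-memory strings stand in for the files here):

```python
import csv
import io
import json

# In-memory stand-ins for corpus.jsonl, queries.jsonl and qrels.tsv
corpus_jsonl = '{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born..."}\n'
queries_jsonl = '{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}\n'
qrels_tsv = "query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n"

# Build the corpus dict: doc id -> {"title": ..., "text": ...}
corpus = {}
for line in corpus_jsonl.splitlines():
    doc = json.loads(line)
    corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}

# Build the queries dict: query id -> query text
queries = {}
for line in queries_jsonl.splitlines():
    q = json.loads(line)
    queries[q["_id"]] = q["text"]

# Build the qrels dict: query id -> {doc id: relevance score}
qrels = {}
reader = csv.DictReader(io.StringIO(qrels_tsv), delimiter="\t")
for row in reader:
    qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])

print(qrels)  # {'q1': {'doc1': 1}}
```

The resulting `corpus`, `queries`, and `qrels` dictionaries match the shapes shown in the example above.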
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
Uilham/CharlinhaV2 | ---
license: openrail
---
|
nz/closest_to_7000_range_1000_to_9000 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 3782152.1113243764
num_examples: 10000
- name: test
num_bytes: 378215.21113243763
num_examples: 1000
download_size: 2169493
dataset_size: 4160367.322456814
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
SFKs/ddd.com | ---
license: openrail
language:
- ae
task_categories:
- text-classification
---
<div align="center">
<article style="display: flex; flex-direction: column; align-items: center; justify-content: center;">
<p align="center"><img width="300" src="https://user-images.githubusercontent.com/25022954/209616423-9ab056be-5d62-4eeb-b91d-3b20f64cfcf8.svg" /></p>
<h1 style="width: 100%; text-align: center;"></h1>
</p>
</article>
<a href="./README_zh-CN.md" >简体中文</a> | English
</div>
## Introduction
LabelU is an open-source data annotation tool with Chinese-language support. It currently provides image annotation capabilities such as rectangle, polygon, point, line, classification, and caption, covering computer vision task scenarios including detection, classification, segmentation, text transcription, line detection, and key point detection. You can customize annotation tasks by freely combining tools, and export data in COCO and Mask formats.
## Getting started
### Install locally with miniconda
```
# Download and Install miniconda
# https://docs.conda.io/en/latest/miniconda.html
# Create virtual environment(python = 3.7)
conda create -n labelu python=3.7
# Activate virtual environment
conda activate labelu
# Install labelu
pip install labelu
# Start labelu, server: http://localhost:8000
labelu
```
### Install for local development
```
# Download and Install miniconda
# https://docs.conda.io/en/latest/miniconda.html
# Create virtual environment(python = 3.7)
conda create -n labelu python=3.7
# Activate virtual environment
conda activate labelu
# Install poetry
# https://python-poetry.org/docs/#installing-with-the-official-installer
# Install all package dependencies
poetry install
# Start labelu, server: http://localhost:8000
uvicorn labelu.main:app --reload
# Update submodule
git submodule update --remote --merge
```
## Features
- Uniform: six image annotation tools are provided, which can be configured through simple visualization or YAML
- Unlimited: multiple tools can be freely combined to meet most image annotation requirements
<p align="center">
<img style="width: 600px" src="https://user-images.githubusercontent.com/25022954/209318236-79d3a5c3-2700-46c3-b59a-62d9c132a6c3.gif">
</p>
- Universal: supports multiple data export formats, including LabelU, COCO, and Mask
## Scenes
### Computer Vision
- Detection: Detection scenes for vehicles, license plates, pedestrians, faces, industrial parts, etc.
- Classification: object classification, target characteristics, right/wrong judgments, and other classification scenarios
- Semantic segmentation: Human body segmentation, panoramic segmentation, drivable area segmentation, vehicle segmentation, etc.
- Text transcription: Text detection and recognition of license plates, invoices, insurance policies, signs, etc.
- Contour detection: positioning line scenes such as human contour lines, lane lines, etc.
- Key point detection: positioning scenes such as human face key points, vehicle key points, road edge key points, etc.
## Usage
- [Guide](./docs/GUIDE.md)
## Annotation Format
- [LabelU Annotation Format](./docs/annotation%20format/README.md)
## Communication
Welcome to the Opendatalab Wechat group!
<p align="center">
<img style="width: 400px" src="https://user-images.githubusercontent.com/25022954/208374419-2dffb701-321a-4091-944d-5d913de79a15.jpg">
</p>
## Links
- [labelU-Kit](https://github.com/opendatalab/labelU-Kit)(Powered by labelU-Kit)
## LICENSE
This project is released under the [Apache 2.0 license](./LICENSE). |
Nexdata/155_People_Malay_Speech_Data_by_Mobile_Phone_Guiding | ---
license: cc-by-nc-nd-4.0
---
## Description
155 local Malaysian speakers participated in the recording, and the recording environment is quiet. The recordings cover categories such as in-car scenes, smart home, and speech assistant, with 50 sentences per speaker. The valid recording time is 7 hours. All texts are manually transcribed with high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/dataset/172?source=Huggingface
## Format
16kHz, 16bit, uncompressed wav, mono channel
## Recording environment
quiet indoor environment, without echo
## Recording content (read speech)
smart car; smart home; speech assistant
## Speaker
155 Malays, 66% of whom are female
## Device
Android mobile phone, iPhone
## Language
Malay
## Transcription content
text, 4 special identifiers
## Accuracy rate
95% (the accuracy rate of identifiers is not included)
## Application scenarios
speech recognition, voiceprint recognition
# Licensing Information
Commercial License
|
ge0rgeli/CMU_LTI | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 12024787
num_examples: 12937
download_size: 4390877
dataset_size: 12024787
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kalcho100/flippy_final1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1042126622.5634332
num_examples: 763081
- name: test
num_bytes: 115792150.43656677
num_examples: 84787
download_size: 623847339
dataset_size: 1157918773.0
---
# Dataset Card for "flippy_final1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
furry-br/brota | ---
license: openrail
---
|
nilq/baby-python-and-tiny-stories | ---
dataset_info:
features:
- name: content
dtype: string
- name: origin
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 3405114404
num_examples: 12815477
- name: validation
num_bytes: 158270211
num_examples: 1059278
download_size: 1611741279
dataset_size: 3563384615
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
emmadrex/emma_ncipated_ietg_ioas_001 | ---
license: cc-by-nc-sa-4.0
pretty_name: 'Isometric Equilateral Triangle Grid: Impossible Objects and Structures'
task_categories:
- summarization
--- |
BUT-FIT/BUT-LCC | ---
task_categories:
- text-generation
- fill-mask
language:
- cs
pretty_name: BUT-LCC
size_categories:
- 10B<n<100B
extra_gated_prompt: "By completing the form below, you acknowledge that the provided data is offered as is. Although we anticipate no problems, you accept full responsibility for any repercussions resulting from the use of this data. Furthermore, you agree that the data must not be utilized for malicious or harmful purposes towards humanity."
extra_gated_fields:
Name: text
Email: text
Affiliation: text
Country: text
Usecase: text
I have explicitly check with my jurisdiction and I confirm that downloading BUT-LCC is legal in the country/region where I am located right now, and for the use case that I have described above: checkbox
You agree to not attempt to determine the identity of individuals in this dataset: checkbox
---
# BUT-LCC Corpus
BUT-LCC (Brno University of Technology Large Czech Collection) is a corpus of Czech texts. It was cleaned using exact deduplication, fuzzy deduplication (using MinHashLSH), an n-gram language model, and an SVM classifier that filters inappropriate content (trained on data we labelled manually).
## Data Sources
<table>
<thead>
<tr>
<th>Part</th>
<th>GB of text</th>
<th>GB of titles</th>
<th>%</th>
</tr>
</thead>
<tbody>
<tr>
<td>CulturaX</td>
<td>157.79</td>
<td>3.85</td>
<td>49</td>
</tr>
<tr>
<td>TenTen-cs-2017</td>
<td>48.97</td>
<td>0.95</td>
<td>15</td>
</tr>
<tr>
<td>BUT_Crawl</td>
<td>25.15</td>
<td>0.8</td>
<td>8</td>
</tr>
<tr>
<td>cswiki-20230101</td>
<td>1.05</td>
<td>0.01</td>
<td>0</td>
</tr>
<tr>
<td>historical</td>
<td>13.47</td>
<td>0.00</td>
<td>4</td>
</tr>
<tr>
<td>hplt</td>
<td>65.55</td>
<td>3.20</td>
<td>21</td>
</tr>
<tr>
<td>idnes_comments</td>
<td>7.38</td>
<td>0.03</td>
<td>2</td>
</tr>
</tbody>
<tfoot>
<tr>
<td><b>Sum</b></td>
<td><b>319.36</b></td>
<td><b>8.84</b></td>
<td></td>
</tr>
</tfoot>
</table>
## Format
The corpus consists of train and test splits. It uses the jsonl format, meaning that every sample is a JSON object on its own line.
### Sample Format
```json
{
"id": unique identifier,
"part": original source,
"title": source document title,
"text": the context
}
```
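Since every sample sits on its own line, the splits can be streamed without loading the whole corpus into memory. A minimal sketch (the file path passed in is an assumption, not a fixed name in this repository):

```python
import json

def iter_samples(path):
    """Yield one corpus sample (a dict with id/part/title/text) per jsonl line."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines defensively
                yield json.loads(line)
```

Each yielded dict should carry the `id`, `part`, `title`, and `text` fields described above.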
# License Information
- We do not own any of the text from which these text data has been extracted.
- We license the actual packaging of these text data under the Creative Commons CC0 license ("no rights reserved").
Detailed licensing information for contained corpora (not crawled by us) is below.
| Corpus | Licensing Information|
|-----------------|----------------|
| CulturaX | [uonlp/CulturaX](https://huggingface.co/datasets/uonlp/CulturaX#license-information) |
| TenTen-cs-2017 | [NLP Centre Web Corpus License Agreement](https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-4835) |
| Czech Wikipedia | [CC BY-SA 4.0 DEED](https://creativecommons.org/licenses/by-sa/4.0/deed.en) |
| Historical | OCR'd documents since 1850, publicly available from the [Czech Digital Library](https://www.digitalniknihovna.cz/) |
| HPLT | [https://hplt-project.org/datasets/v1.2](https://hplt-project.org/datasets/v1.2) |
## Our Models Linked to This Dataset
- [BUT-FIT/CSMPT7B](https://huggingface.co/BUT-FIT/csmpt7b)
- [BUT-FIT/CSTinyLlama-1.2B](https://huggingface.co/BUT-FIT/CSTinyLlama-1.2B)
- [BUT-FIT/Czech-GPT-2-XL-133k](https://huggingface.co/BUT-FIT/Czech-GPT-2-XL-133k)
## Statistics
<table>
<thead>
<tr>
<th>Split</th>
<th>Samples</th>
</tr>
</thead>
<tbody>
<tr>
<td>Train</td>
<td>176 780 582</td>
</tr>
<tr>
<td>Test</td>
<td>20 000</td>
</tr>
</tbody>
</table>
## ID-to-URL Mapping
If you need to recover the original webpages, we provide an ID-to-source-URL mapping (where possible) in the id2url.csv file.
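A minimal sketch for loading such a mapping into a dict; the two-column id-then-URL layout of id2url.csv is an assumption:

```python
import csv

def load_id2url(path):
    """Read an id -> source URL mapping from a two-column CSV file."""
    mapping = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if len(row) >= 2:  # ignore malformed/short rows
                mapping[row[0]] = row[1]
    return mapping
```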
# Acknowledgement
This work was supported by NAKI III program of Ministry of Culture Czech Republic, project semANT ---
"Sémantický průzkumník textového kulturního dědictví" grant no. `DH23P03OVV060` and
by the Ministry of Education, Youth and Sports of the Czech Republic through the e-INFRA CZ (ID:`90254`).
# Contributors
- [Jan Doležal](https://www.fit.vut.cz/person/idolezal/.en) developed cleaning pipeline for text processing, collected data for cleaning, and analyzed cutoff threshold for pruning.
- [Martin Dočkal](https://www.fit.vut.cz/person/idocekal/.en) uploaded data to Huggingface, and helped with cutoff analysis.
- [Martin Fajčík](https://mfajcik.github.io/) reviewed existing corpora, planned pipeline steps, processed data for LM training, and verified their usefulness.
- [Martin Kišš](https://www.fit.vut.cz/person/ikiss/.en) downloaded historical documents, and ran our PeroOCR on the collection.
- [Karel Beneš](https://www.fit.vut.cz/person/ibenes/.en) performed cleaning of historical documents, and created n-gram lm for document filtering.
- [Karel Ondřej](https://www.fit.vut.cz/person/ondrej/.en) who wrote a crawler for collecting BUT_Crawl and prepared preliminary clean corpus version.
- [Michal Hradiš](https://www.fit.vut.cz/person/ihradis/.en) managed the work, and pushed the members when necessary.
|
anhnv125/ud_alpaca | ---
dataset_info:
- config_name: be_hse
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 66768815
num_examples: 21555
- name: validation
num_bytes: 3370351
num_examples: 1090
- name: test
num_bytes: 2873580
num_examples: 889
download_size: 5480853
dataset_size: 73012746
- config_name: bxr_bdt
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 56167
num_examples: 19
- name: test
num_bytes: 2821495
num_examples: 908
download_size: 228304
dataset_size: 2877662
- config_name: cs_pdt
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 216399245
num_examples: 68495
- name: validation
num_bytes: 29301204
num_examples: 9270
- name: test
num_bytes: 32048085
num_examples: 10148
download_size: 25707376
dataset_size: 277748534
- config_name: de_gsd
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 44307017
num_examples: 13814
- name: validation
num_bytes: 2477610
num_examples: 799
- name: test
num_bytes: 3070360
num_examples: 977
download_size: 4999156
dataset_size: 49854987
- config_name: en_ewt
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 38805886
num_examples: 12543
- name: validation
num_bytes: 6000641
num_examples: 2002
- name: test
num_bytes: 6198885
num_examples: 2077
download_size: 3810046
dataset_size: 51005412
- config_name: es_ancora
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 49943950
num_examples: 14305
- name: validation
num_bytes: 5798461
num_examples: 1654
- name: test
num_bytes: 5985191
num_examples: 1721
download_size: 8063762
dataset_size: 61727602
- config_name: fr_gsd
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 48157929
num_examples: 14449
- name: validation
num_bytes: 4906593
num_examples: 1476
- name: test
num_bytes: 1378398
num_examples: 416
download_size: 6341149
dataset_size: 54442920
- config_name: hsb_ufal
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 74433
num_examples: 23
- name: test
num_bytes: 1963315
num_examples: 623
download_size: 218777
dataset_size: 2037748
- config_name: kk_ktb
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 102630
num_examples: 31
- name: test
num_bytes: 3176663
num_examples: 1047
download_size: 257360
dataset_size: 3279293
- config_name: lt_hse
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 501163
num_examples: 153
- name: validation
num_bytes: 501163
num_examples: 153
- name: test
num_bytes: 501163
num_examples: 153
download_size: 229455
dataset_size: 1503489
- config_name: ru_syntagrus
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 163096580
num_examples: 48814
- name: validation
num_bytes: 21977495
num_examples: 6584
- name: test
num_bytes: 21691135
num_examples: 6491
download_size: 21623891
dataset_size: 206765210
configs:
- config_name: be_hse
data_files:
- split: train
path: be_hse/train-*
- split: validation
path: be_hse/validation-*
- split: test
path: be_hse/test-*
- config_name: bxr_bdt
data_files:
- split: train
path: bxr_bdt/train-*
- split: test
path: bxr_bdt/test-*
- config_name: cs_pdt
data_files:
- split: train
path: cs_pdt/train-*
- split: validation
path: cs_pdt/validation-*
- split: test
path: cs_pdt/test-*
- config_name: de_gsd
data_files:
- split: train
path: de_gsd/train-*
- split: validation
path: de_gsd/validation-*
- split: test
path: de_gsd/test-*
- config_name: en_ewt
data_files:
- split: train
path: en_ewt/train-*
- split: validation
path: en_ewt/validation-*
- split: test
path: en_ewt/test-*
- config_name: es_ancora
data_files:
- split: train
path: es_ancora/train-*
- split: validation
path: es_ancora/validation-*
- split: test
path: es_ancora/test-*
- config_name: fr_gsd
data_files:
- split: train
path: fr_gsd/train-*
- split: validation
path: fr_gsd/validation-*
- split: test
path: fr_gsd/test-*
- config_name: hsb_ufal
data_files:
- split: train
path: hsb_ufal/train-*
- split: test
path: hsb_ufal/test-*
- config_name: kk_ktb
data_files:
- split: train
path: kk_ktb/train-*
- split: test
path: kk_ktb/test-*
- config_name: lt_hse
data_files:
- split: train
path: lt_hse/train-*
- split: validation
path: lt_hse/validation-*
- split: test
path: lt_hse/test-*
- config_name: ru_syntagrus
data_files:
- split: train
path: ru_syntagrus/train-*
- split: validation
path: ru_syntagrus/validation-*
- split: test
path: ru_syntagrus/test-*
---
|
acidcoma/ru_librispeech_for_speaker_separation | ---
license: cc-by-sa-4.0
---
A dataset for the audio source separation task, based on the Russian LibriSpeech (RuLS) dataset. It contains 50,000 two-speaker audio mixtures in the train part and 12,500 audio mixtures in the test part.
The dataset also contains metadata files with the audio duration (sec) and the file paths of source 1 and source 2 for each audio mixture.
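Assuming the metadata files are CSVs with a per-mixture duration column (the column name `duration` is our assumption, not documented here), the total mixture duration can be summed with a short sketch:

```python
import csv

def total_duration_sec(metadata_path: str) -> float:
    """Sum the per-mixture duration column (in seconds) of a metadata CSV."""
    total = 0.0
    with open(metadata_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += float(row["duration"])
    return total
```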
source: https://www.openslr.org/96/ |
sean0042/KorMedMCQA | ---
configs:
- config_name: doctor
data_files:
- split: train
path: data/doctor-train.csv
- split: dev
path: data/doctor-dev.csv
- split: test
path: data/doctor-test.csv
- config_name: nurse
data_files:
- split: train
path: data/nurse-train.csv
- split: dev
path: data/nurse-dev.csv
- split: test
path: data/nurse-test.csv
- config_name: pharm
data_files:
- split: train
path: data/pharm-train.csv
- split: dev
path: data/pharm-dev.csv
- split: test
path: data/pharm-test.csv
license: cc-by-nc-2.0
task_categories:
- question-answering
language:
- ko
tags:
- medical
size_categories:
- 10K<n<100K
---
# KorMedMCQA : Multi-Choice Question Answering Benchmark for Korean Healthcare Professional Licensing Examinations
We introduce KorMedMCQA, the first Korean multiple-choice question answering (MCQA) benchmark derived from Korean healthcare professional licensing examinations, covering the years 2012 through 2023.
This dataset consists of a selection of questions from the licensing examinations for doctors, nurses, and pharmacists, featuring a diverse array of subjects.
We conduct baseline experiments on various large language models, including proprietary/open-source, multilingual/Korean-additional pretrained, and clinical context pretrained models, highlighting the potential for further enhancements.
We make our data publicly available on HuggingFace and provide an evaluation script via LM-Harness, inviting further exploration and advancement in Korean healthcare environments.
Paper : https://arxiv.org/abs/2403.01469
## Dataset Details
### Languages
Korean
### Subtask
```
from datasets import load_dataset
doctor = load_dataset(path = "sean0042/KorMedMCQA",name = "doctor")
nurse = load_dataset(path = "sean0042/KorMedMCQA",name = "nurse")
pharmacist = load_dataset(path = "sean0042/KorMedMCQA",name = "pharm")
```
### Statistics
| Category | # Questions (Train/Dev/Test) |
|------------------------------|------------------------------|
| Doctor | 2,339 (1,890/164/285) |
| Nurse | 1,460 (582/291/587) |
| Pharmacist | 1,546 (632/300/614) |
### Data Fields
- `subject`: doctor, nurse, or pharm
- `year`: year of the examination
- `period`: period of the examination
- `q_number`: question number of the examination
- `question`: question
- `A`: First answer choice
- `B`: Second answer choice
- `C`: Third answer choice
- `D`: Fourth answer choice
- `E`: Fifth answer choice
- `answer` : Answer (1 to 5). 1 denotes answer A, and 5 denotes answer E
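Since `answer` is a 1-based integer (1 denotes A, 5 denotes E), mapping model predictions back to choice letters is a one-line lookup; the helper below is illustrative, not part of the dataset:

```python
CHOICES = ["A", "B", "C", "D", "E"]

def answer_letter(answer: int) -> str:
    """Map the dataset's 1-based answer index (1-5) to its choice letter."""
    if not 1 <= answer <= 5:
        raise ValueError(f"answer must be in 1..5, got {answer}")
    return CHOICES[answer - 1]
```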
## Contact
```
sean0042@kaist.ac.kr
``` |
316usman/thematic2e_rr | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 110744216.31788611
num_examples: 174242
download_size: 42629039
dataset_size: 110744216.31788611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
presencesw/QAK_raw_test | ---
dataset_info:
features:
- name: uuid
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 42306
num_examples: 10
download_size: 40972
dataset_size: 42306
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Aniketnew7/check_new | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 294342
num_examples: 711
download_size: 114023
dataset_size: 294342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Yamei/TVCG_Papers | ---
dataset_info:
features:
- name: data
struct:
- name: issue
struct:
- name: id
dtype: string
- name: title
dtype: string
- name: year
dtype: string
- name: issueNum
dtype: string
- name: idPrefix
dtype: string
- name: pubType
dtype: string
- name: volume
dtype: string
- name: label
dtype: string
- name: downloadables
struct:
- name: hasCover
dtype: bool
- name: __typename
dtype: string
- name: __typename
dtype: string
- name: article
struct:
- name: id
dtype: string
- name: doi
dtype: string
- name: abstract
dtype: string
- name: abstracts
list:
- name: abstractType
dtype: string
- name: content
dtype: string
- name: __typename
dtype: string
- name: normalizedAbstract
dtype: string
- name: title
dtype: string
- name: normalizedTitle
dtype: string
- name: fno
dtype: string
- name: hasPdf
dtype: bool
- name: idPrefix
dtype: string
- name: keywords
list: string
- name: authors
list:
- name: givenName
dtype: string
- name: surname
dtype: string
- name: fullName
dtype: string
- name: affiliation
dtype: string
- name: __typename
dtype: string
- name: replicability
struct:
- name: isEnabled
dtype: bool
- name: codeDownloadUrl
dtype: string
- name: codeRepositoryUrl
dtype: string
- name: __typename
dtype: string
- name: showBuyMe
dtype: bool
- name: showRecommendedArticles
dtype: bool
- name: isOpenAccess
dtype: bool
- name: issueNum
dtype: string
- name: pubDate
dtype: string
- name: pubType
dtype: string
- name: pages
dtype: string
- name: year
dtype: string
- name: issn
dtype: string
- name: isbn
dtype: string
- name: notes
dtype: string
- name: notesType
dtype: string
- name: __typename
dtype: string
- name: recommendedArticles
list:
- name: id
dtype: string
- name: title
dtype: string
- name: doi
dtype: string
- name: abstractUrl
dtype: string
- name: parentPublication
struct:
- name: id
dtype: string
- name: title
dtype: string
- name: __typename
dtype: string
- name: __typename
dtype: string
- name: adjacentArticles
struct:
- name: previous
struct:
- name: fno
dtype: string
- name: articleId
dtype: string
- name: __typename
dtype: string
- name: next
struct:
- name: fno
dtype: string
- name: articleId
dtype: string
- name: __typename
dtype: string
- name: __typename
dtype: string
- name: webExtras
list:
- name: id
dtype: string
- name: name
dtype: string
- name: location
dtype: string
- name: extension
dtype: string
- name: size
dtype: string
- name: __typename
dtype: string
- name: articleVideos
list: string
splits:
- name: train
num_bytes: 39196837
num_examples: 5178
download_size: 15953795
dataset_size: 39196837
tags:
- 'visualization '
- papers
- academic
- text
pretty_name: 'TVCG Papers '
size_categories:
- 1K<n<10K
---
# Dataset Card for "TVCG_Papers"
* Dataset Description
Repository:
Paper:
Leaderboard:
Point of Contact:
* Dataset Summary: This dataset contains 5,178 papers from IEEE TVCG, with multiple raw attributes for each paper, including both metadata and abstracts.
* Dataset Structure: JSONL file; each paper instance is a JSON object.
* Data Fields:
```
FEATURE = Features({
'data': Features({
'issue': Features({
'id': Value(dtype='string'),
'title': Value(dtype='string'),
'year': Value(dtype='string'),
'issueNum': Value(dtype='string'),
'idPrefix': Value(dtype='string'),
'pubType': Value(dtype='string'),
'volume': Value(dtype='string'),
'label': Value(dtype='string'),
'downloadables': Features({
'hasCover': Value(dtype='bool'),
'__typename': Value(dtype='string')
}),
'__typename': Value(dtype='string')
}),
'article': Features({
'id': Value(dtype='string'),
'doi': Value(dtype='string'),
'abstract': Value(dtype='string'),
'abstracts': [
{
'abstractType': Value(dtype='string'),
'content': Value(dtype='string'),
'__typename': Value(dtype='string')
}
],
'normalizedAbstract': Value(dtype='string'),
'title': Value(dtype='string'),
'normalizedTitle': Value(dtype='string'),
'fno': Value(dtype='string'),
'hasPdf': Value(dtype='bool'),
'idPrefix': Value(dtype='string'),
'keywords': [
Value(dtype='string')
],
'authors': [
{
'givenName': Value(dtype='string'),
'surname': Value(dtype='string'),
'fullName': Value(dtype='string'),
'affiliation': Value(dtype='string'),
'__typename': Value(dtype='string')
}
],
'replicability': Features({
'isEnabled': Value(dtype='bool'),
'codeDownloadUrl': Value(dtype='string'),
'codeRepositoryUrl': Value(dtype='string'),
'__typename': Value(dtype='string')
}),
'showBuyMe': Value(dtype='bool'),
'showRecommendedArticles': Value(dtype='bool'),
'isOpenAccess': Value(dtype='bool'),
'issueNum':Value(dtype='string'),
'pubDate': Value(dtype='string'),
'pubType': Value(dtype='string'),
'pages': Value(dtype='string'),
'year': Value(dtype='string'),
'issn': Value(dtype='string'),
'isbn': Value(dtype='string'),
'notes': Value(dtype='string'),
'notesType': Value(dtype='string'),
'__typename': Value(dtype='string'),
}),
'recommendedArticles': [
{
'id': Value(dtype='string'),
'title': Value(dtype='string'),
'doi': Value(dtype='string'),
'abstractUrl': Value(dtype='string'),
'parentPublication':
{
'id': Value(dtype='string'),
'title': Value(dtype='string'),
'__typename': Value(dtype='string')
},
'__typename': Value(dtype='string')
},
],
'adjacentArticles': Features({
'previous': {
'fno': Value(dtype='string'),
'articleId': Value(dtype='string'),
'__typename': Value(dtype='string')},
'next': {
'fno': Value(dtype='string'),
'articleId': Value(dtype='string'),
'__typename': Value(dtype='string')
},
'__typename': Value(dtype='string')
}),
'webExtras': [
Features({
'id': Value(dtype='string'),
'name': Value(dtype='string'),
'location': Value(dtype='string'),
'extension': Value(dtype='string'),
'size': Value(dtype='string'),
'__typename': Value(dtype='string')
})
],
'articleVideos': [Value(dtype='string')]
})
})
```
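As a usage sketch against the schema above, author names can be pulled out of a loaded record like this (the helper name is ours, not part of the dataset):

```python
def author_names(paper: dict) -> list:
    """Collect each author's fullName from a paper record following the schema above."""
    article = paper.get("data", {}).get("article", {}) or {}
    return [a.get("fullName", "") for a in (article.get("authors") or [])]
```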
* Source Data: https://www.computer.org/csdl/journal/tg
* Citation Information: Yamei Tu (tu.253@osu.edu)
|
poorguys/TW-Kai_2_MingLiU_all_512 | ---
dataset_info:
features:
- name: char
dtype: string
- name: unicode
dtype: string
- name: images
dtype: image
- name: target_images
dtype: image
- name: stroke
dtype: int32
- name: strokes_sequence
sequence: int32
- name: components
sequence: int32
- name: jyutping
dtype: string
splits:
- name: train
num_bytes: 5010639254.25
num_examples: 74491
- name: test
num_bytes: 64615344.25
num_examples: 1791
download_size: 4161218213
dataset_size: 5075254598.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_ehartford__dolphin-2.2.1-mistral-7b | ---
pretty_name: Evaluation run of ehartford/dolphin-2.2.1-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/dolphin-2.2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.2.1-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.2.1-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T17:18:36.579196](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2.1-mistral-7b/blob/main/results_2023-12-04T17-18-36.579196.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6314567324183159,\n\
\ \"acc_stderr\": 0.032318316802746,\n \"acc_norm\": 0.6352434028495076,\n\
\ \"acc_norm_stderr\": 0.032961647633460475,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5311447373702662,\n\
\ \"mc2_stderr\": 0.015062742496541512\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104301\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6431985660227046,\n\
\ \"acc_stderr\": 0.004780764443411322,\n \"acc_norm\": 0.8375821549492133,\n\
\ \"acc_norm_stderr\": 0.0036807989505319113\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267025,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886783,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886783\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n\
\ \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n\
\ \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569508,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569508\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.016232826818678502,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.016232826818678502\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n\
\ \"acc_stderr\": 0.012698825252435111,\n \"acc_norm\": 0.4471968709256845,\n\
\ \"acc_norm_stderr\": 0.012698825252435111\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5311447373702662,\n\
\ \"mc2_stderr\": 0.015062742496541512\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.01161619821577323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4806671721000758,\n \
\ \"acc_stderr\": 0.013762185709851354\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/dolphin-2.2.1-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|arc:challenge|25_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|drop|3_2023-11-18T21-53-53.398955.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-18T21-53-53.398955.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|gsm8k|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hellaswag|10_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T21-53-53.398955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-18-36.579196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-18-36.579196.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- '**/details_harness|winogrande|5_2023-11-18T21-53-53.398955.parquet'
- split: 2023_12_04T17_18_36.579196
path:
- '**/details_harness|winogrande|5_2023-12-04T17-18-36.579196.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T17-18-36.579196.parquet'
- config_name: results
data_files:
- split: 2023_11_18T21_53_53.398955
path:
- results_2023-11-18T21-53-53.398955.parquet
- split: 2023_12_04T17_18_36.579196
path:
- results_2023-12-04T17-18-36.579196.parquet
- split: latest
path:
- results_2023-12-04T17-18-36.579196.parquet
---
# Dataset Card for Evaluation run of ehartford/dolphin-2.2.1-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.2.1-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.2.1-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2.1-mistral-7b",
"harness_winogrande_5",
         split="latest")
```
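The configuration names above follow a predictable pattern derived from the harness task identifier and the number of few-shot examples (for instance `hendrycksTest-college_chemistry` evaluated 5-shot becomes `harness_hendrycksTest_college_chemistry_5`). As a sketch, a small hypothetical helper (not part of the `datasets` library) can build the config name for any task in this card:

```python
def task_to_config(task: str, n_shot: int) -> str:
    """Build the config name used by this dataset card from a harness
    task identifier, e.g. "hendrycksTest-college_chemistry" with 5 shots
    becomes "harness_hendrycksTest_college_chemistry_5".

    Separators used in task names ("-", ":", ".") are mapped to "_".
    """
    slug = task.replace("-", "_").replace(":", "_").replace(".", "_")
    return f"harness_{slug}_{n_shot}"


print(task_to_config("hendrycksTest-college_chemistry", 5))
# harness_hendrycksTest_college_chemistry_5
print(task_to_config("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can then be passed as the second argument to `load_dataset`, as in the example above.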
## Latest results
These are the [latest results from run 2023-12-04T17:18:36.579196](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2.1-mistral-7b/blob/main/results_2023-12-04T17-18-36.579196.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the task-specific configurations and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6314567324183159,
"acc_stderr": 0.032318316802746,
"acc_norm": 0.6352434028495076,
"acc_norm_stderr": 0.032961647633460475,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5311447373702662,
"mc2_stderr": 0.015062742496541512
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104301
},
"harness|hellaswag|10": {
"acc": 0.6431985660227046,
"acc_stderr": 0.004780764443411322,
"acc_norm": 0.8375821549492133,
"acc_norm_stderr": 0.0036807989505319113
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727062,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727062
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267025,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886783,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886783
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569508,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569508
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678502,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678502
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435111,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435111
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5311447373702662,
"mc2_stderr": 0.015062742496541512
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.01161619821577323
},
"harness|gsm8k|5": {
"acc": 0.4806671721000758,
"acc_stderr": 0.013762185709851354
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7b-v0.4-slerp | ---
pretty_name: Evaluation run of Kukedlc/NeuralSynthesis-7b-v0.4-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralSynthesis-7b-v0.4-slerp](https://huggingface.co/Kukedlc/NeuralSynthesis-7b-v0.4-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7b-v0.4-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T21:42:59.138066](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7b-v0.4-slerp/blob/main/results_2024-04-15T21-42-59.138066.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6496475727493363,\n\
\ \"acc_stderr\": 0.032073410949095264,\n \"acc_norm\": 0.6485178166054084,\n\
\ \"acc_norm_stderr\": 0.03274978269013846,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7807489693040384,\n\
\ \"mc2_stderr\": 0.013673388858932743\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136437\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7180840470025891,\n\
\ \"acc_stderr\": 0.004490130691020433,\n \"acc_norm\": 0.8913563035251942,\n\
\ \"acc_norm_stderr\": 0.0031055566317393897\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.01655328786311604,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.01655328786311604\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7807489693040384,\n\
\ \"mc2_stderr\": 0.013673388858932743\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.012493927348659629\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralSynthesis-7b-v0.4-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-59.138066.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-42-59.138066.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- '**/details_harness|winogrande|5_2024-04-15T21-42-59.138066.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T21-42-59.138066.parquet'
- config_name: results
data_files:
- split: 2024_04_15T21_42_59.138066
path:
- results_2024-04-15T21-42-59.138066.parquet
- split: latest
path:
- results_2024-04-15T21-42-59.138066.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralSynthesis-7b-v0.4-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralSynthesis-7b-v0.4-slerp](https://huggingface.co/Kukedlc/NeuralSynthesis-7b-v0.4-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7b-v0.4-slerp",
"harness_winogrande_5",
split="train")
```
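Because each run is stored as a timestamp-named split (alongside the fixed `latest` alias), you can pick the most recent run programmatically without hard-coding a timestamp. A minimal sketch, assuming split names follow the `YYYY_MM_DDTHH_MM_SS.ffffff` format used in this card:

```python
def most_recent_split(split_names):
    """Return the newest timestamp-named split, ignoring the fixed "latest" alias.

    The underscore-separated timestamp format (e.g. "2024_04_15T21_42_59.138066")
    is zero-padded and fixed-width, so lexicographic order matches chronological
    order and plain max() suffices.
    """
    stamps = [s for s in split_names if s != "latest"]
    return max(stamps)

# Example with the split names from this card:
splits = ["2024_04_15T21_42_59.138066", "latest"]
print(most_recent_split(splits))  # -> 2024_04_15T21_42_59.138066
```

In practice the split names come from `load_dataset(...).keys()` when loading without a `split=` argument; the helper above only assumes the naming convention shown in this card's config list.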
## Latest results
These are the [latest results from run 2024-04-15T21:42:59.138066](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7b-v0.4-slerp/blob/main/results_2024-04-15T21-42-59.138066.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6496475727493363,
"acc_stderr": 0.032073410949095264,
"acc_norm": 0.6485178166054084,
"acc_norm_stderr": 0.03274978269013846,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7807489693040384,
"mc2_stderr": 0.013673388858932743
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136437
},
"harness|hellaswag|10": {
"acc": 0.7180840470025891,
"acc_stderr": 0.004490130691020433,
"acc_norm": 0.8913563035251942,
"acc_norm_stderr": 0.0031055566317393897
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.01655328786311604,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.01655328786311604
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7807489693040384,
"mc2_stderr": 0.013673388858932743
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571764
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.012493927348659629
}
}
```
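The per-task entries in the JSON above share a common schema, keyed as `harness|<task>|<n_shot>` with `acc`/`acc_norm` (or `mc1`/`mc2`) metrics inside. A small sketch of pulling out `acc_norm` per task, using a hand-copied subset of the values above (the dict literal here is just an excerpt, not the full results):

```python
# Excerpt of the results JSON above; keys follow the "harness|<task>|<n_shot>" pattern.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7320819112627986},
    "harness|hellaswag|10": {"acc_norm": 0.8913563035251942},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8362573099415205},
}

# Collect acc_norm per task, skipping entries (e.g. truthfulqa, winogrande, gsm8k)
# that report other metrics instead.
acc_norms = {task: m["acc_norm"] for task, m in results.items() if "acc_norm" in m}

for task, value in sorted(acc_norms.items()):
    print(f"{task}: {value:.4f}")
```

The same pattern works on the full JSON once loaded with `json.load`; the `"all"` entry holds the card's pre-aggregated metrics, so you rarely need to recompute them yourself.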
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Kukedlc__Neural4gsm8k | ---
pretty_name: Evaluation run of Kukedlc/Neural4gsm8k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Neural4gsm8k](https://huggingface.co/Kukedlc/Neural4gsm8k) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Neural4gsm8k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T15:02:17.754206](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural4gsm8k/blob/main/results_2024-02-18T15-02-17.754206.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538369289060512,\n\
\ \"acc_stderr\": 0.03197232182000714,\n \"acc_norm\": 0.6529137599217497,\n\
\ \"acc_norm_stderr\": 0.03264304838525449,\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6965382922735985,\n\
\ \"mc2_stderr\": 0.014784401689917369\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002508,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7074287990440151,\n\
\ \"acc_stderr\": 0.004540134005060323,\n \"acc_norm\": 0.8844851623182632,\n\
\ \"acc_norm_stderr\": 0.003189889789404671\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700483,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700483\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.01642167050633918,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.01642167050633918\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6965382922735985,\n\
\ \"mc2_stderr\": 0.014784401689917369\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781102\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7338893100833965,\n \
\ \"acc_stderr\": 0.012172750939040319\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Neural4gsm8k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|arc:challenge|25_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|gsm8k|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hellaswag|10_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-02-17.754206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T15-02-17.754206.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- '**/details_harness|winogrande|5_2024-02-18T15-02-17.754206.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T15-02-17.754206.parquet'
- config_name: results
data_files:
- split: 2024_02_18T15_02_17.754206
path:
- results_2024-02-18T15-02-17.754206.parquet
- split: latest
path:
- results_2024-02-18T15-02-17.754206.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Neural4gsm8k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Neural4gsm8k](https://huggingface.co/Kukedlc/Neural4gsm8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Neural4gsm8k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T15:02:17.754206](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural4gsm8k/blob/main/results_2024-02-18T15-02-17.754206.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6538369289060512,
"acc_stderr": 0.03197232182000714,
"acc_norm": 0.6529137599217497,
"acc_norm_stderr": 0.03264304838525449,
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6965382922735985,
"mc2_stderr": 0.014784401689917369
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002508,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.7074287990440151,
"acc_stderr": 0.004540134005060323,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.003189889789404671
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700483,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.01642167050633918,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.01642167050633918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083135,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6965382922735985,
"mc2_stderr": 0.014784401689917369
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781102
},
"harness|gsm8k|5": {
"acc": 0.7338893100833965,
"acc_stderr": 0.012172750939040319
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SpellmanAi/SpellmanAi-50b | ---
license: mit
---
|
bouim/dvoice3 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: duration
dtype: float64
splits:
- name: train
num_bytes: 1459262910.208
num_examples: 2117
- name: test
num_bytes: 75535309.0
num_examples: 114
download_size: 1032875305
dataset_size: 1534798219.208
---
# Dataset Card for "dvoice3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/deutschland_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of deutschland/ドイッチュラント/德意志 (Azur Lane)
This is the dataset of deutschland/ドイッチュラント/德意志 (Azur Lane), containing 349 images and their tags.
The core tags of this character are `multicolored_hair, streaked_hair, blue_eyes, black_hair, long_hair, red_hair, white_hair, breasts, bangs, very_long_hair, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 349 | 511.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/deutschland_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 349 | 279.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/deutschland_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 867 | 611.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/deutschland_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 349 | 450.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/deutschland_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 867 | 877.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/deutschland_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/deutschland_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, goggles_on_headwear, grin, iron_cross, looking_at_viewer, peaked_cap, solo, black_headwear, cleavage, sharp_teeth, armband, black_background, simple_background, upper_body |
| 1 | 5 |  |  |  |  |  | 1girl, goggles_on_headwear, grin, holding_gun, iron_cross, long_sleeves, solo, black_headwear, looking_at_viewer, peaked_cap, sharp_teeth, choker, armband, mechanical_hands, simple_background, thigh_strap |
| 2 | 7 |  |  |  |  |  | 1girl, goggles_on_headwear, iron_cross, lace-up_boots, long_sleeves, looking_at_viewer, platform_boots, solo, armband, crossed_legs, holding_gun, knee_boots, sharp_teeth, black_footwear, black_headwear, cleavage, simple_background, white_background, grin, peaked_cap, sitting |
| 3 | 10 |  |  |  |  |  | bare_shoulders, black_bikini, blush, hair_flower, navel, open_mouth, stomach, 1girl, :d, cleavage, looking_at_viewer, medium_breasts, side-tie_bikini_bottom, solo, sunscreen, bare_arms, holding_bottle, outdoors, black_nails, day, hibiscus, pouring_onto_self, sitting, skin_fang, beach, cloud, front-tie_bikini_top, halterneck, lotion_bottle, nail_polish, thighs, blue_sky, sand, collarbone, mole_on_body, ocean, ribbon, twintails |
| 4 | 9 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, solo, open_mouth, smile, long_sleeves, cross, earrings |
| 5 | 6 |  |  |  |  |  | 1girl, folding_fan, holding_fan, looking_at_viewer, red_dress, smile, solo, thighband_pantyhose, black_pantyhose, china_dress, hair_flower, sleeveless_dress, open_mouth, sitting, bare_shoulders, hair_between_eyes, large_breasts, medium_breasts, red_footwear, shawl |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | goggles_on_headwear | grin | iron_cross | looking_at_viewer | peaked_cap | solo | black_headwear | cleavage | sharp_teeth | armband | black_background | simple_background | upper_body | holding_gun | long_sleeves | choker | mechanical_hands | thigh_strap | lace-up_boots | platform_boots | crossed_legs | knee_boots | black_footwear | white_background | sitting | bare_shoulders | black_bikini | blush | hair_flower | navel | open_mouth | stomach | :d | medium_breasts | side-tie_bikini_bottom | sunscreen | bare_arms | holding_bottle | outdoors | black_nails | day | hibiscus | pouring_onto_self | skin_fang | beach | cloud | front-tie_bikini_top | halterneck | lotion_bottle | nail_polish | thighs | blue_sky | sand | collarbone | mole_on_body | ocean | ribbon | twintails | black_dress | smile | cross | earrings | folding_fan | holding_fan | red_dress | thighband_pantyhose | black_pantyhose | china_dress | sleeveless_dress | hair_between_eyes | large_breasts | red_footwear | shawl |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------------|:-------|:-------------|:--------------------|:-------------|:-------|:-----------------|:-----------|:--------------|:----------|:-------------------|:--------------------|:-------------|:--------------|:---------------|:---------|:-------------------|:--------------|:----------------|:-----------------|:---------------|:-------------|:-----------------|:-------------------|:----------|:-----------------|:---------------|:--------|:--------------|:--------|:-------------|:----------|:-----|:-----------------|:-------------------------|:------------|:------------|:-----------------|:-----------|:--------------|:------|:-----------|:--------------------|:------------|:--------|:--------|:-----------------------|:-------------|:----------------|:--------------|:---------|:-----------|:-------|:-------------|:---------------|:--------|:---------|:------------|:--------------|:--------|:--------|:-----------|:--------------|:--------------|:------------|:----------------------|:------------------|:--------------|:-------------------|:--------------------|:----------------|:---------------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | | X | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | | X | | X | | | | | | | | | | | | | | | | | | | X | X | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
|
gkiwi/sd-prompt | ---
license: mit
---
|
kursathalat/ranking_ds | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for ranking_ds
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("kursathalat/ranking_ds")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("kursathalat/ranking_ds")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; at the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt | text | True | False |
| response1 | Response 1 | text | True | False |
| response2 | Response 2 | text | False | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| preference | Order responses based on your preference | ranking | True | N/A | ['response1', 'response2'] |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to existing questions, and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. Accordingly, the possible values are the same as in the table above, but the column name is suffixed with "-suggestion" and the metadata with "-suggestion-metadata".
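As a minimal sketch of this naming convention (the snippet below is illustrative, not part of Argilla's API), the derived suggestion column names for this dataset's single `preference` question are:

```python
# Hypothetical illustration: derive the suggestion column names that appear
# in the exported `datasets` records for each question name
question_names = ["preference"]
suggestion_columns = [
    col
    for name in question_names
    for col in (f"{name}-suggestion", f"{name}-suggestion-metadata")
]
print(suggestion_columns)
```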
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"prompt": "What do you prefer?",
"response1": " \"If you d",
"response2": "on\u0027t wear "
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "525f4274-ebb4-4aee-a116-f8b422b2e2b4",
"values": {
"preference": {
"value": [
{
"rank": 1,
"value": "response1"
},
{
"rank": 2,
"value": "response2"
}
]
}
}
},
{
"status": "submitted",
"user_id": "464c365f-9fd7-4db9-a06f-42c320c54547",
"values": {
"preference": {
"value": [
{
"rank": 2,
"value": "response1"
},
{
"rank": 1,
"value": "response2"
}
]
}
}
}
],
"suggestions": [
{
"agent": null,
"question_name": "preference",
"score": null,
"type": null,
"value": [
{
"rank": 1,
"value": "response1"
},
{
"rank": 2,
"value": "response2"
}
]
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"preference": [
{
"status": "submitted",
"user_id": "525f4274-ebb4-4aee-a116-f8b422b2e2b4",
"value": {
"rank": [
1,
2
],
"value": [
"response1",
"response2"
]
}
},
{
"status": "submitted",
"user_id": "464c365f-9fd7-4db9-a06f-42c320c54547",
"value": {
"rank": [
2,
1
],
"value": [
"response1",
"response2"
]
}
}
],
"preference-suggestion": {
"rank": [
1,
2
],
"value": [
"response1",
"response2"
]
},
"preference-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"prompt": "What do you prefer?",
"response1": " \"If you d",
"response2": "on\u0027t wear "
}
```
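The ranking structure above can be unpacked with plain Python. The following sketch operates on a trimmed literal copy of the record shown (no download needed) and pairs each option with its rank, per annotator:

```python
# A trimmed copy of the record above, in the HuggingFace `datasets` layout
record = {
    "prompt": "What do you prefer?",
    "preference": [
        {
            "status": "submitted",
            "user_id": "525f4274-ebb4-4aee-a116-f8b422b2e2b4",
            "value": {"rank": [1, 2], "value": ["response1", "response2"]},
        },
        {
            "status": "submitted",
            "user_id": "464c365f-9fd7-4db9-a06f-42c320c54547",
            "value": {"rank": [2, 1], "value": ["response1", "response2"]},
        },
    ],
}

# Pair each ranked option with its rank, one dict per annotator
rankings = [
    dict(zip(resp["value"]["value"], resp["value"]["rank"]))
    for resp in record["preference"]
]
print(rankings)  # the two annotators disagree on which response ranks first
```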
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **prompt** is of type `text`.
* **response1** is of type `text`.
* (optional) **response2** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **preference** is of type `ranking` with the following allowed values ['response1', 'response2'].
* **Suggestions:** As of Argilla 1.13.0, suggestions have been included to assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **preference-suggestion** is of type `ranking` with the following allowed values ['response1', 'response2'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a preference dataset that contains contexts and options. Please rank the options that you would prefer in the given context.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_macroeconomics-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 6771
num_examples: 5
- name: test
num_bytes: 3153207
num_examples: 390
download_size: 281037
dataset_size: 3159978
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_macroeconomics-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/autotrain-data-wikiart-sample2 |
open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k | ---
pretty_name: Evaluation run of mncai/Mistral-7B-OpenOrca-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Mistral-7B-OpenOrca-1k](https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T07:41:12.101153](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k/blob/main/results_2023-10-25T07-41-12.101153.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0053481543624161075,\n\
\ \"em_stderr\": 0.0007469252903319289,\n \"f1\": 0.09739828020134218,\n\
\ \"f1_stderr\": 0.001857285751420582,\n \"acc\": 0.45294831833688076,\n\
\ \"acc_stderr\": 0.01023434017882167\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0053481543624161075,\n \"em_stderr\": 0.0007469252903319289,\n\
\ \"f1\": 0.09739828020134218,\n \"f1_stderr\": 0.001857285751420582\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \
\ \"acc_stderr\": 0.008944213403553095\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090247\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T07_41_12.101153
path:
- '**/details_harness|drop|3_2023-10-25T07-41-12.101153.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T07-41-12.101153.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T07_41_12.101153
path:
- '**/details_harness|gsm8k|5_2023-10-25T07-41-12.101153.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T07-41-12.101153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T07_41_12.101153
path:
- '**/details_harness|winogrande|5_2023-10-25T07-41-12.101153.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T07-41-12.101153.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- results_2023-10-10T11-19-13.410150.parquet
- split: 2023_10_25T07_41_12.101153
path:
- results_2023-10-25T07-41-12.101153.parquet
- split: latest
path:
- results_2023-10-25T07-41-12.101153.parquet
---
# Dataset Card for Evaluation run of mncai/Mistral-7B-OpenOrca-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/Mistral-7B-OpenOrca-1k](https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-25T07:41:12.101153](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k/blob/main/results_2023-10-25T07-41-12.101153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"em": 0.0053481543624161075,
"em_stderr": 0.0007469252903319289,
"f1": 0.09739828020134218,
"f1_stderr": 0.001857285751420582,
"acc": 0.45294831833688076,
"acc_stderr": 0.01023434017882167
},
"harness|drop|3": {
"em": 0.0053481543624161075,
"em_stderr": 0.0007469252903319289,
"f1": 0.09739828020134218,
"f1_stderr": 0.001857285751420582
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553095
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090247
}
}
```
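For this run, the top-level `acc` in the `"all"` block is simply the unweighted mean of the accuracies of the two accuracy-reporting tasks (GSM8K and Winogrande; DROP reports EM/F1 instead). A quick sanity check of that aggregation, using the numbers above (an illustration, not part of the evaluation harness):

```python
# Per-task accuracies copied from the results block above.
task_acc = {
    "harness|gsm8k|5": 0.1197877179681577,
    "harness|winogrande|5": 0.7861089187056038,
}

# The aggregated "acc" is the unweighted mean across accuracy-reporting tasks.
aggregate_acc = sum(task_acc.values()) / len(task_acc)
print(aggregate_acc)  # ≈ 0.4529, matching the "all" block
```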
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DavidMOBrien/small_benchmark_webmarket | ---
dataset_info:
features:
- name: before
dtype: string
- name: after
dtype: string
- name: loc
dtype: int64
- name: repo
dtype: string
splits:
- name: train
num_bytes: 36670
num_examples: 23
download_size: 24389
dataset_size: 36670
---
# Dataset Card for "small_benchmark_webmarket"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sina-Alinejad-2002/multiply_operation_prediction | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 38758
num_examples: 37
- name: validation
num_bytes: 3282
num_examples: 3
download_size: 43649
dataset_size: 42040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
shossain/merged-no-pad-text-16384 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 372439533
num_examples: 6401
download_size: 184155020
dataset_size: 372439533
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "merged-no-pad-text-16384"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erfanzar/LinguaMatic-Mixin | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 168057889
num_examples: 69765
download_size: 88116993
dataset_size: 168057889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- es
- ru
- de
- pl
- th
- vi
- sv
- bn
- da
- he
- it
- fa
- sk
- id
- nb
- el
- nl
- hu
- eu
- zh
- eo
- ja
- ca
- cs
- bg
- fi
- pt
- tr
- ro
- ar
- uk
- gl
- fr
- ko
tags:
- code
- biology
- medical
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- text-classification
- conversational
---
# Dataset Card for "UltraChat-Mixin"
## Overview
Prompts are formatted in the Llama 2 chat style.
### ChatMatic
The ChatMatic dataset is built from a mix of four other datasets, with the best examples carefully chosen from each using GPT-4. It contains system messages, dialogs with a conversation depth greater than 5, and longer sequence lengths. The source datasets are:
"oasst2"
"ise-uiuc/Magicoder-Evol-Instruct-110K"
"vicgalle/alpaca-gpt4"
"LDJnr/Capybara"
## Dataset Configuration
The dataset is configured as follows:
```yaml
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 168057889
num_examples: 69765
download_size: 88116993
dataset_size: 168057889
```
## Features
The UltraChat-Mixin dataset consists of the following features:
- **prompt**: A string containing the conversation dialog, formatted as a Llama 2 style prompt.
## Splits
The dataset contains a single split:
- **train**: This split is used for training conversational AI models. It consists of 69,765 examples and has a size of approximately 168,057,889 bytes.
## Download Size
The download size of the UltraChat-Mixin dataset is approximately 88,116,993 bytes.
## Dataset Size
The total size of the UltraChat-Mixin dataset is approximately 168,057,889 bytes.
Please note that the dataset configuration and statistics provided above are based on the information provided by Erfan Zare Chavoshi.
|
distilled-from-one-sec-cv12/chunk_27 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 898023020
num_examples: 174985
download_size: 917718427
dataset_size: 898023020
---
# Dataset Card for "chunk_27"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__feed-sen_vi-b48d12-2175169954 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b1
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: futin/feed
* Config: sen_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
wars08/emp | ---
license: openrail
---
|
omarelsayeed/2000dataset | ---
dataset_info:
features:
- name: input_values
struct:
- name: attention_mask
sequence:
sequence: int32
- name: input_values
sequence:
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1200580612
num_examples: 2001
download_size: 536444205
dataset_size: 1200580612
---
# Dataset Card for "2000dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
L33tcode/mini-sahaj-v1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5295
num_examples: 19
download_size: 5548
dataset_size: 5295
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
isaquecerqueira/millan_internet_traffic | ---
license: odbl
---
# Milan Internet Traffic Dataset
This dataset contains information about hourly internet traffic in Milan between 2013-11-01 and 2014-01-01. |
ittailup/lallama-orca-small | ---
dataset_info:
features:
- name: question
dtype: string
- name: response
dtype: string
- name: system_prompt
dtype: string
splits:
- name: train
num_bytes: 182851981
num_examples: 24000
download_size: 95835294
dataset_size: 182851981
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "lallama-orca-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ckotait/ATRDataset | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 674327851.666
num_examples: 16706
- name: validation
num_bytes: 46935738.0
num_examples: 1000
- name: test
num_bytes: 16859858.0
num_examples: 200
download_size: 813600043
dataset_size: 738123447.666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for Dataset Name
The Active Template Regression (ATR) dataset comprises 18 semantic category labels, including face, sunglasses, hat, scarf, hair, upper clothes, left arm, right arm, belt, pants, left leg, right leg, skirt, left shoe, right shoe, bag, dress, and background. A total of 17,700 images were incorporated into the ATR dataset. 16,700 images were designated for training, and 1,000 for testing.
- **Curated by:** Xiaodan Liang, Si Liu, Xiaohui Shen, Jianchao Yang, Luoqi Liu, Jian Dong, Liang Lin, Shuicheng Yan
- **Shared by:** Xiaodan Liang, Si Liu, Xiaohui Shen, Jianchao Yang, Luoqi Liu, Jian Dong, Liang Lin, Shuicheng Yan
- **License:** MIT
# Dataset Sources
- **Repository:** https://github.com/lemondan/HumanParsing-Dataset
- **Paper:** Deep Human Parsing with Active Template Regression
# Human Parsing Labels
- 0: **background**
- 1: **hat**
- 2: **hair**
- 3: **sunglasses**
- 4: **upperclothes**
- 5: **skirt**
- 6: **pants**
- 7: **dress**
- 8: **belt**
- 9: **leftshoe**
- 10: **rightshoe**
- 11: **face**
- 12: **leftleg**
- 13: **rightleg**
- 14: **leftarm**
- 15: **rightarm**
- 16: **bag**
- 17: **scarf**
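The mapping above can be expressed as a small lookup table, e.g. for decoding class ids in predicted segmentation masks (a sketch for convenience, not part of the official release):

```python
# ATR label ids to class names, in the order listed above.
ATR_LABELS = [
    "background", "hat", "hair", "sunglasses", "upperclothes", "skirt",
    "pants", "dress", "belt", "leftshoe", "rightshoe", "face",
    "leftleg", "rightleg", "leftarm", "rightarm", "bag", "scarf",
]

def decode_label(label_id: int) -> str:
    """Return the ATR class name for a label id in [0, 17]."""
    if not 0 <= label_id < len(ATR_LABELS):
        raise ValueError(f"unknown ATR label id: {label_id}")
    return ATR_LABELS[label_id]

print(decode_label(11))  # face
```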
# Uses
Semantic segmentation, and more specifically, human body parsing.
# Dataset Card Authors
Christian Kotait
**BibTeX:**
```bibtex
@article{liang2015deep,
  title={Deep human parsing with active template regression},
  author={Liang, Xiaodan and Liu, Si and Shen, Xiaohui and Yang, Jianchao and Liu, Luoqi and Dong, Jian and Lin, Liang and Yan, Shuicheng},
  journal={IEEE transactions on pattern analysis and machine intelligence},
  volume={37},
  number={12},
  pages={2402--2414},
  year={2015},
  publisher={IEEE}
}
```
|
alizeepace/rejection_sampling_phi_2_OA_rm | ---
license: apache-2.0
language:
- en
size_categories:
- n<1K
multilinguality:
- monolingual
pretty_name: Rejection Sampling Phi-2 with OpenAssistant RM
annotations_creators:
- machine-generated
language_creators:
- expert-generated
- machine-generated
source_datasets:
- GSM8K
task_categories:
- question-answering
- text-generation
- conversational
---
# Dataset Card for Rejection Sampling Phi-2 with OpenAssistant RM
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Point of Contact:** [Alizée Pace](mailto:alizee.pace@ai.ethz.ch)
### Dataset Summary
The "Rejection Sampling Phi-2 with OpenAssistant RM" dataset consists of 10 pairs of prompts and responses, which were generated using rejection sampling over 10 Phi-2 generation using the OpenAssistant Reward Model.
### Supported Tasks and Leaderboards
The dataset and its creation rationale could be used to support models for question-answering, text-generation, or conversational agents.
### Languages
The BCP-47 code for English as generally spoken in the United States is en-US and the BCP-47 code for English as generally spoken in the United Kingdom is en-GB. It is unknown if other varieties of English are represented in the data.
## Dataset Structure
### Data Instances
For each instance, there is a list of strings for the model prompt, a string for the best generation, and a float for the reward model score.
```
{'prompt': ['Write a story about a lion that lives in the city and falls in love with a pigeon.'],
'generation': 'Instruction: Write a story about a lion that lives in the city and falls in love with a pigeon.\nOutput: Leo was a lion who lived in the city. He hated the noise and the pollution. He longed for the freedom and the beauty of the savanna. He spent his days wandering around the streets, looking for prey or a place to nap.\n\nOne day, he spotted a pigeon sitting on a ledge. He was hungry and decided to try to catch it. He roared loudly, hoping to scare the bird away. But the pigeon was not afraid. It flew up and landed on his nose.\n\nLeo was surprised and annoyed. He puffed up his mane and tried to shake the pigeon off. But the pigeon was persistent. It pecked at his nose and made jokes.\n\nLeo was irritated and bored. He decided to play along. He rubbed his nose and made funny faces. The pigeon laughed and joined in. They became friends and had fun together.\n\nLeo realized that the pigeon was not so bad. It was actually quite smart and funny. It taught him new words and tricks. It showed him new sights and sounds. It made him forget about his problems and enjoy the moment.\n\nLeo fell in love with the pigeon. He wanted to be with it forever. He asked it to leave the city and follow him to the savanna. The pigeon agreed. It said it would always be his friend and his mate.\n\nLeo and the pigeon left the city and embarked on a new adventure. They found happiness and peace in the wild. They lived happily ever after.\n',
'rm_score': 3.1899418831}
```
### Data Fields
- `prompt`: a list of strings containing the prompts used to query Phi-2, with each string representing a turn.
- `generation`: a string containing the best-scored Phi-2 generation out of 10, including the formatted prompt.
- `rm_score`: a float corresponding to the OpenAssistant reward score assigned to the generation.
### Data Splits
The dataset is not divided into splits.
## Dataset Creation
### Curation Rationale
[N/A]
### Source Data
#### Initial Data Collection and Normalization
Prompts were written or selected by a human to capture a range of instruction-following abilities (creativity, factuality, reasoning, etc.).
Generations were produced for each prompt using the [Phi-2](https://huggingface.co/microsoft/phi-2) language model. Following the Phi-2 QA and chat prompting instructions, prompts were formatted as follows:
* single-turn queries: `"Instruction: {query}\nOutput: "`
* multi-turn queries: `"Alice: {query_turn_1}\nBob: {query_turn_2}...\nAlice: {query_turn_n}\nBob: "`
Only the top-scored out of 10 generations were retained for this dataset; see [Annotations](#annotations) for details on the scoring procedure.
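The two formatting rules above can be sketched as a small helper (the function name is illustrative, not taken from the dataset's own tooling; it assumes query turns alternate speakers starting with "Alice"):

```python
def format_phi2_prompt(turns):
    """Format a list of query turns following the Phi-2 conventions above.

    Single-turn queries use the Instruction/Output QA format; multi-turn
    queries alternate Alice/Bob speakers and end with "Bob: " so the model
    completes the assistant turn.
    """
    if len(turns) == 1:
        return f"Instruction: {turns[0]}\nOutput: "
    lines = []
    for i, turn in enumerate(turns):
        speaker = "Alice" if i % 2 == 0 else "Bob"
        lines.append(f"{speaker}: {turn}")
    lines.append("Bob: ")
    return "\n".join(lines)

print(format_phi2_prompt(["Write a haiku."]))
# Instruction: Write a haiku.
# Output:
```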
#### Who are the source language producers?
Two out of the 10 prompts were taken from the [GSM8K](https://huggingface.co/datasets/gsm8k) dataset to probe reasoning ability. The rest were generated by a human writer.
Generations were produced for each prompt using the [Phi-2](https://huggingface.co/microsoft/phi-2) language model.
### Annotations
#### Annotation process
All generations are scored using an external reward model.
#### Who are the annotators?
Reward scores were produced for each generation using the [Open Assistant reward model](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2).
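The best-of-n selection used to build this dataset (sample n generations, score each with the reward model, keep the highest-scoring one) can be sketched generically. The `generate` and `score` callables below are stand-ins for the actual Phi-2 sampler and the OpenAssistant reward model; the toy demo rewards length just to make the selection observable:

```python
import itertools

def rejection_sample(prompt, generate, score, n=10):
    """Return the best of n generations for a prompt, with its reward score.

    `generate(prompt)` produces one candidate completion and
    `score(prompt, generation)` returns a scalar reward; both are
    placeholders for the real sampler and reward model.
    """
    candidates = [generate(prompt) for _ in range(n)]
    scored = [(score(prompt, g), g) for g in candidates]
    best_score, best_gen = max(scored, key=lambda pair: pair[0])
    return {"prompt": [prompt], "generation": best_gen, "rm_score": best_score}

# Toy demo: the "reward" is the generation length, so the longest wins.
gens = itertools.cycle(["short", "a bit longer", "the longest generation"])
result = rejection_sample(
    "Write a story.",
    generate=lambda p: next(gens),
    score=lambda p, g: len(g),
    n=3,
)
print(result["generation"])  # the longest generation
```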
### Personal and Sensitive Information
No personal and sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
Some prompts are deliberately designed to probe models' ability to generate or reward safe and harmless text. Prompts and generations may therefore contain offensive or toxic language.
### Discussion of Biases
[N/A]
### Other Known Limitations
The limited size of the dataset does not allow a thorough exploration of all aspects of model instruction-following ability.
## Additional Information
### Dataset Curators
The dataset was collected by Alizée Pace.
### Licensing Information
The dataset is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
[N/A]
### Contributions
Thanks to [@alizeepace](https://github.com/alizeepace) for adding this dataset. |
open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v1.0 | ---
pretty_name: Evaluation run of Changgil/K2S3-SOLAR-11b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Changgil/K2S3-SOLAR-11b-v1.0](https://huggingface.co/Changgil/K2S3-SOLAR-11b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T08:16:12.721232](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v1.0/blob/main/results_2024-03-03T08-16-12.721232.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3019325028575802,\n\
\ \"acc_stderr\": 0.032389857676374596,\n \"acc_norm\": 0.30472238961229525,\n\
\ \"acc_norm_stderr\": 0.03320572038243745,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4599370863671749,\n\
\ \"mc2_stderr\": 0.0152829942731636\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30716723549488056,\n \"acc_stderr\": 0.013481034054980945,\n\
\ \"acc_norm\": 0.3370307167235495,\n \"acc_norm_stderr\": 0.013813476652902272\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.39404501095399325,\n\
\ \"acc_stderr\": 0.004876459434619797,\n \"acc_norm\": 0.5139414459271061,\n\
\ \"acc_norm_stderr\": 0.004987841367402512\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32452830188679244,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.32452830188679244,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421255\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n\
\ \"acc_stderr\": 0.029957851329869334,\n \"acc_norm\": 0.1907514450867052,\n\
\ \"acc_norm_stderr\": 0.029957851329869334\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\
\ \"acc_stderr\": 0.026148685930671746,\n \"acc_norm\": 0.3032258064516129,\n\
\ \"acc_norm_stderr\": 0.026148685930671746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.03793713171165635,\n\
\ \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.03793713171165635\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29797979797979796,\n \"acc_stderr\": 0.032586303838365555,\n \"\
acc_norm\": 0.29797979797979796,\n \"acc_norm_stderr\": 0.032586303838365555\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2846153846153846,\n \"acc_stderr\": 0.022878322799706283,\n\
\ \"acc_norm\": 0.2846153846153846,\n \"acc_norm_stderr\": 0.022878322799706283\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29541284403669726,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\
: 0.29541284403669726,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.02769691071309394,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.02769691071309394\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236434,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236434\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3924050632911392,\n \"acc_stderr\": 0.0317847187456473,\n \
\ \"acc_norm\": 0.3924050632911392,\n \"acc_norm_stderr\": 0.0317847187456473\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857144,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857144\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.41452991452991456,\n\
\ \"acc_stderr\": 0.03227396567623778,\n \"acc_norm\": 0.41452991452991456,\n\
\ \"acc_norm_stderr\": 0.03227396567623778\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3486590038314176,\n\
\ \"acc_stderr\": 0.01704124314349094,\n \"acc_norm\": 0.3486590038314176,\n\
\ \"acc_norm_stderr\": 0.01704124314349094\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.025190181327608415,\n\
\ \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.025190181327608415\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976257,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976257\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.026992544339297243,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.026992544339297243\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.026229649178821163,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.026229649178821163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.02718712701150379,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.02718712701150379\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2900912646675359,\n\
\ \"acc_stderr\": 0.011590375554733093,\n \"acc_norm\": 0.2900912646675359,\n\
\ \"acc_norm_stderr\": 0.011590375554733093\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294268,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294268\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27124183006535946,\n \"acc_stderr\": 0.01798661530403031,\n \
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.01798661530403031\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3795918367346939,\n \"acc_stderr\": 0.03106721126287248,\n\
\ \"acc_norm\": 0.3795918367346939,\n \"acc_norm_stderr\": 0.03106721126287248\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n\
\ \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.32338308457711445,\n\
\ \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4599370863671749,\n\
\ \"mc2_stderr\": 0.0152829942731636\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5753749013417522,\n \"acc_stderr\": 0.013891893150264225\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \
\ \"acc_stderr\": 0.0031957470754808135\n }\n}\n```"
repo_url: https://huggingface.co/Changgil/K2S3-SOLAR-11b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|arc:challenge|25_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|gsm8k|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hellaswag|10_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-16-12.721232.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T08-16-12.721232.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- '**/details_harness|winogrande|5_2024-03-03T08-16-12.721232.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T08-16-12.721232.parquet'
- config_name: results
data_files:
- split: 2024_03_03T08_16_12.721232
path:
- results_2024-03-03T08-16-12.721232.parquet
- split: latest
path:
- results_2024-03-03T08-16-12.721232.parquet
---
# Dataset Card for Evaluation run of Changgil/K2S3-SOLAR-11b-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Changgil/K2S3-SOLAR-11b-v1.0](https://huggingface.co/Changgil/K2S3-SOLAR-11b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-03T08:16:12.721232](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-SOLAR-11b-v1.0/blob/main/results_2024-03-03T08-16-12.721232.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in its task configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.3019325028575802,
"acc_stderr": 0.032389857676374596,
"acc_norm": 0.30472238961229525,
"acc_norm_stderr": 0.03320572038243745,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4599370863671749,
"mc2_stderr": 0.0152829942731636
},
"harness|arc:challenge|25": {
"acc": 0.30716723549488056,
"acc_stderr": 0.013481034054980945,
"acc_norm": 0.3370307167235495,
"acc_norm_stderr": 0.013813476652902272
},
"harness|hellaswag|10": {
"acc": 0.39404501095399325,
"acc_stderr": 0.004876459434619797,
"acc_norm": 0.5139414459271061,
"acc_norm_stderr": 0.004987841367402512
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343602,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343602
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32452830188679244,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.32452830188679244,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421255,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421255
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869334,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869334
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.026148685930671746,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.026148685930671746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29797979797979796,
"acc_stderr": 0.032586303838365555,
"acc_norm": 0.29797979797979796,
"acc_norm_stderr": 0.032586303838365555
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2846153846153846,
"acc_stderr": 0.022878322799706283,
"acc_norm": 0.2846153846153846,
"acc_norm_stderr": 0.022878322799706283
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03308611113236434,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03308611113236434
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3924050632911392,
"acc_stderr": 0.0317847187456473,
"acc_norm": 0.3924050632911392,
"acc_norm_stderr": 0.0317847187456473
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857144,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857144
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.41452991452991456,
"acc_stderr": 0.03227396567623778,
"acc_norm": 0.41452991452991456,
"acc_norm_stderr": 0.03227396567623778
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3486590038314176,
"acc_stderr": 0.01704124314349094,
"acc_norm": 0.3486590038314176,
"acc_norm_stderr": 0.01704124314349094
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.025190181327608415,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.025190181327608415
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976257,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976257
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.026992544339297243,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.026992544339297243
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.026229649178821163,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.026229649178821163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.02718712701150379,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.02718712701150379
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2900912646675359,
"acc_stderr": 0.011590375554733093,
"acc_norm": 0.2900912646675359,
"acc_norm_stderr": 0.011590375554733093
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294268,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294268
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.01798661530403031,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.01798661530403031
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3795918367346939,
"acc_stderr": 0.03106721126287248,
"acc_norm": 0.3795918367346939,
"acc_norm_stderr": 0.03106721126287248
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4599370863671749,
"mc2_stderr": 0.0152829942731636
},
"harness|winogrande|5": {
"acc": 0.5753749013417522,
"acc_stderr": 0.013891893150264225
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754808135
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_golaxy__gogpt2-13b | ---
pretty_name: Evaluation run of golaxy/gogpt2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [golaxy/gogpt2-13b](https://huggingface.co/golaxy/gogpt2-13b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gogpt2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T09:08:59.644904](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt2-13b/blob/main/results_2023-10-23T09-08-59.644904.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25041946308724833,\n\
\ \"em_stderr\": 0.004436932707454965,\n \"f1\": 0.319959102348994,\n\
\ \"f1_stderr\": 0.004400567822301105,\n \"acc\": 0.3496193990687978,\n\
\ \"acc_stderr\": 0.00851200635523702\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.25041946308724833,\n \"em_stderr\": 0.004436932707454965,\n\
\ \"f1\": 0.319959102348994,\n \"f1_stderr\": 0.004400567822301105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \
\ \"acc_stderr\": 0.003900413385915719\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6787687450670876,\n \"acc_stderr\": 0.013123599324558321\n\
\ }\n}\n```"
repo_url: https://huggingface.co/golaxy/gogpt2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T09_08_59.644904
path:
- '**/details_harness|drop|3_2023-10-23T09-08-59.644904.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T09-08-59.644904.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T09_08_59.644904
path:
- '**/details_harness|gsm8k|5_2023-10-23T09-08-59.644904.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T09-08-59.644904.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T09_08_59.644904
path:
- '**/details_harness|winogrande|5_2023-10-23T09-08-59.644904.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T09-08-59.644904.parquet'
- config_name: results
data_files:
- split: 2023_10_23T09_08_59.644904
path:
- results_2023-10-23T09-08-59.644904.parquet
- split: latest
path:
- results_2023-10-23T09-08-59.644904.parquet
---
# Dataset Card for Evaluation run of golaxy/gogpt2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gogpt2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gogpt2-13b](https://huggingface.co/golaxy/gogpt2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt2-13b",
"harness_winogrande_5",
split="train")
```
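As a side note, the run splits listed in the configs above follow a simple naming pattern: the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of this mapping (a hypothetical helper for illustration, not part of the leaderboard tooling):

```python
# Illustrative only: map a run timestamp (as shown in the results filenames)
# to the corresponding split name used in the dataset configs.
def timestamp_to_split(timestamp: str) -> str:
    # Split names replace "-" and ":" with "_" so they are valid identifiers.
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-23T09:08:59.644904"))
# -> 2023_10_23T09_08_59.644904
```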
## Latest results
These are the [latest results from run 2023-10-23T09:08:59.644904](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt2-13b/blob/main/results_2023-10-23T09-08-59.644904.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.25041946308724833,
"em_stderr": 0.004436932707454965,
"f1": 0.319959102348994,
"f1_stderr": 0.004400567822301105,
"acc": 0.3496193990687978,
"acc_stderr": 0.00851200635523702
},
"harness|drop|3": {
"em": 0.25041946308724833,
"em_stderr": 0.004436932707454965,
"f1": 0.319959102348994,
"f1_stderr": 0.004400567822301105
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.003900413385915719
},
"harness|winogrande|5": {
"acc": 0.6787687450670876,
"acc_stderr": 0.013123599324558321
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-87500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 16679550990
num_examples: 2500
download_size: 3260969259
dataset_size: 16679550990
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xuebingA/global-binary-image | ---
dataset_info:
features:
- name: image
dtype: image
- name: binary_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 72294091784.5
num_examples: 200026
download_size: 56366537032
dataset_size: 72294091784.5
---
# Dataset Card for "global-binary-image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rubend18/Motivacion-Diaria | ---
task_categories:
- text-classification
- token-classification
- text-generation
language:
- es
tags:
- motivacion
- diaria
- motivaciondiaria
- coach
- coaching
pretty_name: Motivación Diaria
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Name
Motivación Diaria
## Dataset Description
- **Author:** Rubén Darío Jaramillo
- **Email:** rubend18@hotmail.com
- **WhatsApp:** +593 93 979 6676
### Dataset Summary
Scraped from http://www.motivaciondiaria.com/
### Languages
[Spanish] |
El-chapoo/Complex_data | ---
dataset_info:
features:
- name: message_2
dtype: string
- name: model_chosen
dtype: string
- name: answer
dtype: string
- name: markdown
dtype: string
- name: emoji_therapist
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 920839364
num_examples: 351239
download_size: 368992555
dataset_size: 920839364
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sky2004/05 | ---
license: other
---
|
thesistranslation/distilled-ccmatrix-en-es | ---
dataset_info:
features:
- name: id
dtype: int32
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: train
num_bytes: 7084246805
num_examples: 30000000
download_size: 4913968666
dataset_size: 7084246805
language:
- es
- en
---
# Dataset Card for "distilled-ccmatrix-en-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aengusl/fully_clean_helpful_only-v1.0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 138921498.65170428
num_examples: 231343
download_size: 77727046
dataset_size: 138921498.65170428
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
argilla/llama-2-banking-preference | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for llama-2-banking-preference
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/llama-2-banking-preference")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/llama-2-banking-preference")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| request | User request | TextField | True | False |
| response-1 | Assistant response 1 | TextField | True | True |
| response-2 | Assistant response 2 | TextField | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| preference | Order response based on your preference | RankingQuestion | True | 1 = most accurate and helpful, 2 = worst. Ties are allowed. | ['response-1', 'response-2'] |
| correct-response | If both responses are innacurate, complete the response: | TextQuestion | True | N/A | N/A |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names; they contain the value/s of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
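This naming convention can be sketched with a small helper (illustrative only; the real columns are produced by Argilla's export, not by this function):

```python
# Illustrative sketch: derive the exported suggestion column names for a list
# of question names, following the "-suggestion" / "-suggestion-metadata"
# naming convention described above.
def suggestion_columns(question_names):
    columns = []
    for name in question_names:
        columns.append(f"{name}-suggestion")
        columns.append(f"{name}-suggestion-metadata")
    return columns

print(suggestion_columns(["preference", "correct-response"]))
# -> ['preference-suggestion', 'preference-suggestion-metadata',
#     'correct-response-suggestion', 'correct-response-suggestion-metadata']
```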
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"request": "Do I get charged if i am getting money?",
"response-1": " Of course! Thank you for reaching out to us at the Bank of Tillamook. We\u0027re here to help you with any questions or concerns you may have about your bank account. When it comes to fees associated with withdrawals, we do charge a small fee for certain types of transactions. However, please note that there is no charge for accessing your own funds in our bank through one of our ATMs or by using your debit card for purchases. If you could provide me with more details on what kind of transaction you were trying to make, I would be happy to assist you further and explain any potential fees associated with those services. Is there anything else I can help you with today?",
"response-2": " Of course! Thank you for choosing the Bank of Tillamook for your financial needs. We appreciate your business and value your trust in us. To answer your question, no, you will not be charged for withdrawing money from your own account at the Bank of Tillamook. Our bank is committed to providing convenient and fee-free services to our customers, so you can access your funds without worrying about additional charges. If you have any further questions or concerns, please don\u0027t hesitate to ask. Is there anything else I can help you with today?"
},
"id": null,
"metadata": {},
"responses": [],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"correct-response": null,
"correct-response-suggestion": null,
"correct-response-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"metadata": null,
"preference": null,
"preference-suggestion": null,
"preference-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"request": "Do I get charged if i am getting money?",
"response-1": " Of course! Thank you for reaching out to us at the Bank of Tillamook. We\u0027re here to help you with any questions or concerns you may have about your bank account. When it comes to fees associated with withdrawals, we do charge a small fee for certain types of transactions. However, please note that there is no charge for accessing your own funds in our bank through one of our ATMs or by using your debit card for purchases. If you could provide me with more details on what kind of transaction you were trying to make, I would be happy to assist you further and explain any potential fees associated with those services. Is there anything else I can help you with today?",
"response-2": " Of course! Thank you for choosing the Bank of Tillamook for your financial needs. We appreciate your business and value your trust in us. To answer your question, no, you will not be charged for withdrawing money from your own account at the Bank of Tillamook. Our bank is committed to providing convenient and fee-free services to our customers, so you can access your funds without worrying about additional charges. If you have any further questions or concerns, please don\u0027t hesitate to ask. Is there anything else I can help you with today?"
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **request** is of type `TextField`.
* **response-1** is of type `TextField`.
* **response-2** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **preference** is of type `RankingQuestion` with the following allowed values ['response-1', 'response-2'], and description "1 = most accurate and helpful, 2 = worst. Ties are allowed.".
* (optional) **correct-response** is of type `TextQuestion`.
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, suggestions have been included to ease or assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **preference-suggestion** is of type `ranking` with the following allowed values ['response-1', 'response-2'].
* (optional) **correct-response-suggestion** is of type `text`.
Additionally, we also have one more field which is optional and is the following:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Please, read the question carefully and try to answer it as accurately as possible.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Multimodal-Fatima/VQAv2Validation_ViT_L_14_A_T_C_Q_benchmarks | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 98390484
num_examples: 214354
download_size: 19492982
dataset_size: 98390484
---
# Dataset Card for "VQAv2Validation_ViT_L_14_A_T_C_Q_benchmarks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MysticPixel42/erotic_teens_unconditional | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 108314958.264
num_examples: 4632
download_size: 105664309
dataset_size: 108314958.264
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
StDestiny/icsi_cleaned | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4642864.677966102
num_examples: 53
- name: test
num_bytes: 525607.3220338983
num_examples: 6
download_size: 2739229
dataset_size: 5168472.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kraina/airbnb_london_weekends | ---
dataset_info:
features:
- name: _id
dtype: string
- name: realSum
dtype: float64
- name: room_type
dtype: string
- name: room_shared
dtype: bool
- name: room_private
dtype: bool
- name: person_capacity
dtype: float64
- name: host_is_superhost
dtype: bool
- name: multi
dtype: int64
- name: biz
dtype: int64
- name: cleanliness_rating
dtype: float64
- name: guest_satisfaction_overall
dtype: float64
- name: bedrooms
dtype: int64
- name: dist
dtype: float64
- name: metro_dist
dtype: float64
- name: attr_index
dtype: float64
- name: attr_index_norm
dtype: float64
- name: rest_index
dtype: float64
- name: rest_index_norm
dtype: float64
splits:
- name: train
num_bytes: 703844.4180180868
num_examples: 5379
download_size: 407036
dataset_size: 703844.4180180868
---
# Dataset Card for "airbnb_london_weekends"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
santifiorino/spinetta | ---
task_categories:
- audio-to-audio
language:
- es
tags:
- music
- spinetta
- rock
- spanish
pretty_name: spinetta
license: artistic-2.0
---
~10-15 second samples of Luis Alberto Spinetta singing.
Clean vocals, with no instruments and no silences.
Songs from Pescado Rabioso, Almendra, Invisible, and his solo work. |
9wimu9/wiki_support_docs_sin | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: uid
dtype: int64
- name: support_documents
sequence: string
splits:
- name: train
num_bytes: 1607575704
num_examples: 142968
download_size: 648841769
dataset_size: 1607575704
---
# Dataset Card for "wiki_support_docs_sin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Musgrok/SIC | ---
license: mit
---
|
open-llm-leaderboard/details_migtissera__Tess-72B-v1.5b | ---
pretty_name: Evaluation run of migtissera/Tess-72B-v1.5b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Tess-72B-v1.5b](https://huggingface.co/migtissera/Tess-72B-v1.5b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-72B-v1.5b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T06:02:33.204019](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-72B-v1.5b/blob/main/results_2024-02-10T06-02-33.204019.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7641559275832511,\n\
\ \"acc_stderr\": 0.02832639344225231,\n \"acc_norm\": 0.7668756521164218,\n\
\ \"acc_norm_stderr\": 0.02887897070184607,\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404417,\n \"mc2\": 0.7198677184860963,\n\
\ \"mc2_stderr\": 0.014465480359002823\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623503,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.664708225453097,\n\
\ \"acc_stderr\": 0.00471127540813842,\n \"acc_norm\": 0.8553077076279626,\n\
\ \"acc_norm_stderr\": 0.0035107136507595417\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785136,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785136\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474924,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.83,\n\
\ \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846938,\n\
\ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846938\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.02750175294441242,\n\
\ \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.02750175294441242\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.6228070175438597,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6878306878306878,\n \"acc_stderr\": 0.02386520683697259,\n \"\
acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.02386520683697259\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8709677419354839,\n \"acc_stderr\": 0.019070889254792753,\n \"\
acc_norm\": 0.8709677419354839,\n \"acc_norm_stderr\": 0.019070889254792753\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"\
acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.02654443531270646,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.02654443531270646\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246804,\n\
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246804\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45925925925925926,\n \"acc_stderr\": 0.030384169232350818,\n \
\ \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.030384169232350818\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281733,\n \"\
acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571762,\n \"\
acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571762\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640262,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640262\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237104,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446914,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446914\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9233716475095786,\n\
\ \"acc_stderr\": 0.00951217069932386,\n \"acc_norm\": 0.9233716475095786,\n\
\ \"acc_norm_stderr\": 0.00951217069932386\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442262,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442262\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6558659217877095,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.6558659217877095,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.020823758837580916,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.020823758837580916\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n\
\ \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n\
\ \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n\
\ \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6382978723404256,\n \"acc_stderr\": 0.028663820147199485,\n \
\ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.028663820147199485\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6108213820078227,\n\
\ \"acc_stderr\": 0.012452613934287015,\n \"acc_norm\": 0.6108213820078227,\n\
\ \"acc_norm_stderr\": 0.012452613934287015\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8251633986928104,\n \"acc_stderr\": 0.01536616706478065,\n \
\ \"acc_norm\": 0.8251633986928104,\n \"acc_norm_stderr\": 0.01536616706478065\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n\
\ \"acc_stderr\": 0.02068718695153409,\n \"acc_norm\": 0.9054726368159204,\n\
\ \"acc_norm_stderr\": 0.02068718695153409\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404417,\n \"mc2\": 0.7198677184860963,\n\
\ \"mc2_stderr\": 0.014465480359002823\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7695223654283548,\n \
\ \"acc_stderr\": 0.01160024902059582\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Tess-72B-v1.5b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|arc:challenge|25_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|gsm8k|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hellaswag|10_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T06-02-33.204019.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T06-02-33.204019.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- '**/details_harness|winogrande|5_2024-02-10T06-02-33.204019.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T06-02-33.204019.parquet'
- config_name: results
data_files:
- split: 2024_02_10T06_02_33.204019
path:
- results_2024-02-10T06-02-33.204019.parquet
- split: latest
path:
- results_2024-02-10T06-02-33.204019.parquet
---
# Dataset Card for Evaluation run of migtissera/Tess-72B-v1.5b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/Tess-72B-v1.5b](https://huggingface.co/migtissera/Tess-72B-v1.5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-72B-v1.5b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-10T06:02:33.204019](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-72B-v1.5b/blob/main/results_2024-02-10T06-02-33.204019.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7641559275832511,
"acc_stderr": 0.02832639344225231,
"acc_norm": 0.7668756521164218,
"acc_norm_stderr": 0.02887897070184607,
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404417,
"mc2": 0.7198677184860963,
"mc2_stderr": 0.014465480359002823
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623503,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.664708225453097,
"acc_stderr": 0.00471127540813842,
"acc_norm": 0.8553077076279626,
"acc_norm_stderr": 0.0035107136507595417
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785136,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785136
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474924,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846938,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846938
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7702127659574468,
"acc_stderr": 0.02750175294441242,
"acc_norm": 0.7702127659574468,
"acc_norm_stderr": 0.02750175294441242
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.02386520683697259,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.02386520683697259
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.019070889254792753,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.019070889254792753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.02654443531270646,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.02654443531270646
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246804,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246804
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.030384169232350818,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.030384169232350818
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281733,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571762,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571762
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640262,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640262
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237104,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446914,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9233716475095786,
"acc_stderr": 0.00951217069932386,
"acc_norm": 0.9233716475095786,
"acc_norm_stderr": 0.00951217069932386
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442262,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6558659217877095,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.6558659217877095,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.020823758837580916,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.020823758837580916
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8070739549839229,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.8070739549839229,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.028663820147199485,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.028663820147199485
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6108213820078227,
"acc_stderr": 0.012452613934287015,
"acc_norm": 0.6108213820078227,
"acc_norm_stderr": 0.012452613934287015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8251633986928104,
"acc_stderr": 0.01536616706478065,
"acc_norm": 0.8251633986928104,
"acc_norm_stderr": 0.01536616706478065
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.02068718695153409,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.02068718695153409
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404417,
"mc2": 0.7198677184860963,
"mc2_stderr": 0.014465480359002823
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.7695223654283548,
"acc_stderr": 0.01160024902059582
}
}
```
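The aggregated metrics above can also be read programmatically. As a minimal, self-contained sketch (the dict below copies just two of the per-task entries from the JSON above; no download is needed):

```python
# Minimal sketch: reading aggregated metrics from a results dict shaped like the
# "latest results" JSON above. Only two tasks are copied here for illustration.
results = {
    "harness|winogrande|5": {"acc": 0.8145224940805051, "acc_stderr": 0.010923965303140505},
    "harness|gsm8k|5": {"acc": 0.7695223654283548, "acc_stderr": 0.01160024902059582},
}

def mean_acc(results):
    """Average the 'acc' metric over every task that reports it."""
    accs = [m["acc"] for m in results.values() if "acc" in m]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # → 0.792
```

The same traversal works on the full results file once loaded with `json.load`, since every task entry follows the same `{"acc": ..., "acc_stderr": ...}` shape.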
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DBQ/Prada.Product.prices.Portugal | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Portugal - Prada - Product-level price list
tags:
- webscraping
- ecommerce
- Prada
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 1285620
num_examples: 2548
download_size: 385056
dataset_size: 1285620
---
# Prada web scraped data
## About the website
The **luxury fashion industry** in the **EMEA** (Europe, Middle East, and Africa) region is a robust and highly dynamic sector, with particular emphasis on the thriving market in **Portugal**. Known for its high-end labels, the industry is marked by factors such as premium quality, design aesthetics, and brand reputation. **Prada**, a renowned Italian luxury fashion house, holds a significant place in this industry. The industry is fast adapting to the digital era, with notable progress in the realm of **Ecommerce**. The observed dataset contains **product-list page (PLP)** data pertaining to Prada's performance in the Portugal market, reflecting Prada's online presence and Ecommerce strategies in the region.
## Link to **dataset**
[Portugal - Prada - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Prada%20Product-prices%20Portugal/r/recWitpZbmrikZDl1)
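The pricing fields declared in the schema above (`full_price_eur`, `price_eur`, `flg_discount`) lend themselves to simple markdown analyses. A minimal sketch, using made-up rows rather than real scraped records:

```python
# Illustrative rows following the dataset schema above; values are invented.
rows = [
    {"product_code": "A1", "full_price_eur": 1200.0, "price_eur": 1200.0, "flg_discount": 0},
    {"product_code": "A2", "full_price_eur": 950.0, "price_eur": 760.0, "flg_discount": 1},
    {"product_code": "A3", "full_price_eur": 450.0, "price_eur": 450.0, "flg_discount": 0},
]

def discount_share(rows):
    """Fraction of products flagged as discounted."""
    return sum(r["flg_discount"] for r in rows) / len(rows)

def avg_markdown(rows):
    """Average relative markdown (1 - price / full price) over discounted products."""
    discounted = [r for r in rows if r["flg_discount"]]
    return sum(1 - r["price_eur"] / r["full_price_eur"] for r in discounted) / len(discounted)

print(round(discount_share(rows), 2))  # → 0.33
print(round(avg_markdown(rows), 2))    # → 0.2
```

The same functions apply unchanged to the real `train` split once loaded with `datasets.load_dataset`, since each record exposes the same field names.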
|
pierreguillou/DocLayNet-base | ---
language:
- en
- de
- fr
- ja
annotations_creators:
- crowdsourced
license: other
pretty_name: DocLayNet base
size_categories:
- 1K<n<10K
tags:
- DocLayNet
- COCO
- PDF
- IBM
- Financial-Reports
- Finance
- Manuals
- Scientific-Articles
- Science
- Laws
- Law
- Regulations
- Patents
- Government-Tenders
- object-detection
- image-segmentation
- token-classification
task_categories:
- object-detection
- image-segmentation
- token-classification
task_ids:
- instance-segmentation
---
# Dataset Card for DocLayNet base
## About this card (01/27/2023)
### Property and license
All information on this page except the content of this paragraph, "About this card (01/27/2023)", has been copied from [Dataset Card for DocLayNet](https://huggingface.co/datasets/ds4sd/DocLayNet).
DocLayNet is a dataset created by Deep Search (IBM Research) published under [license CDLA-Permissive-1.0](https://huggingface.co/datasets/ds4sd/DocLayNet#licensing-information).
I do not claim any rights to the data taken from this dataset and published on this page.
### DocLayNet dataset
[DocLayNet dataset](https://github.com/DS4SD/DocLayNet) (IBM) provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories.
To date, the dataset can be downloaded through direct links or as a dataset from the Hugging Face datasets library:
- direct links: [doclaynet_core.zip](https://codait-cos-dax.s3.us.cloud-object-storage.appdomain.cloud/dax-doclaynet/1.0.0/DocLayNet_core.zip) (28 GiB), [doclaynet_extra.zip](https://codait-cos-dax.s3.us.cloud-object-storage.appdomain.cloud/dax-doclaynet/1.0.0/DocLayNet_extra.zip) (7.5 GiB)
- Hugging Face dataset library: [dataset DocLayNet](https://huggingface.co/datasets/ds4sd/DocLayNet)
Paper: [DocLayNet: A Large Human-Annotated Dataset for Document-Layout Analysis](https://arxiv.org/abs/2206.01062) (06/02/2022)
### Processing into a format facilitating its use by HF notebooks
These 2 options require downloading all the data (approximately 30 GiB), which takes time (about 45 min in Google Colab) and a large amount of disk space. This could limit experimentation for people with low resources.
Moreover, even when using the download via the HF datasets library, it is necessary to download the EXTRA zip separately ([doclaynet_extra.zip](https://codait-cos-dax.s3.us.cloud-object-storage.appdomain.cloud/dax-doclaynet/1.0.0/DocLayNet_extra.zip), 7.5 GiB) to associate the annotated bounding boxes with the text extracted by OCR from the PDFs. This operation also requires additional code, because the bounding boxes of the texts do not necessarily correspond to the annotated ones (computing the percentage of area in common between an annotated bounding box and a text bounding box makes it possible to match them).
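The area-overlap comparison described above can be sketched in a few lines; boxes are `(x0, y0, x1, y1)` tuples, and the matching threshold one would apply to the returned fraction is an assumption, not part of the original processing code:

```python
# Sketch of the area-overlap check: what fraction of an OCR text box's own area
# lies inside an annotated layout box. Coordinates are (x0, y0, x1, y1).
def overlap_fraction(text_box, annotated_box):
    ax0, ay0, ax1, ay1 = text_box
    bx0, by0, bx1, by1 = annotated_box
    # Width and height of the intersection rectangle (0 if the boxes are disjoint).
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))
    text_area = (ax1 - ax0) * (ay1 - ay0)
    return (iw * ih) / text_area if text_area > 0 else 0.0

# A text box fully inside the annotation overlaps 100%; a disjoint one 0%.
print(overlap_fraction((10, 10, 20, 20), (0, 0, 100, 100)))  # → 1.0
print(overlap_fraction((10, 10, 20, 20), (50, 50, 60, 60)))  # → 0.0
```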
Finally, in order to use Hugging Face notebooks on fine-tuning layout models like LayoutLMv3 or LiLT, DocLayNet data must be processed into a proper format.
For all these reasons, I decided to process the DocLayNet dataset:
- into 3 datasets of different sizes:
  - [DocLayNet small](https://huggingface.co/datasets/pierreguillou/DocLayNet-small) (about 1% of DocLayNet) < 1,000 document images (691 train, 64 val, 49 test)
  - [DocLayNet base](https://huggingface.co/datasets/pierreguillou/DocLayNet-base) (about 10% of DocLayNet) < 10,000 document images (6,910 train, 648 val, 499 test)
  - [DocLayNet large](https://huggingface.co/datasets/pierreguillou/DocLayNet-large) (about 100% of DocLayNet) < 100,000 document images (69,103 train, 6,480 val, 4,994 test)
- with associated texts and PDFs (base64 format),
- and in a format facilitating their use by HF notebooks.
*Note: the layout HF notebooks will greatly help participants of the IBM [ICDAR 2023 Competition on Robust Layout Segmentation in Corporate Documents](https://ds4sd.github.io/icdar23-doclaynet/)!*
### About PDFs languages
Quoted from page 3 of the [DocLayNet paper](https://arxiv.org/abs/2206.01062):
"We did not control the document selection with regard to language. **The vast majority of documents contained in DocLayNet (close to 95%) are published in English language.** However, **DocLayNet also contains a number of documents in other languages such as German (2.5%), French (1.0%) and Japanese (1.0%)**. While the document language has negligible impact on the performance of computer vision methods such as object detection and segmentation models, it might prove challenging for layout analysis methods which exploit textual features."
### About PDFs categories distribution
Citation of the page 3 of the [DocLayNet paper](https://arxiv.org/abs/2206.01062):
"The pages in DocLayNet can be grouped into **six distinct categories**, namely **Financial Reports, Manuals, Scientific Articles, Laws & Regulations, Patents and Government Tenders**. Each document category was sourced from various repositories. For example, Financial Reports contain both free-style format annual reports which expose company-specific, artistic layouts as well as the more formal SEC filings. The two largest categories (Financial Reports and Manuals) contain a large amount of free-style layouts in order to obtain maximum variability. In the other four categories, we boosted the variability by mixing documents from independent providers, such as different government websites or publishers. In Figure 2, we show the document categories contained in DocLayNet with their respective sizes."

### Download & overview
The size of DocLayNet base is about 10% of the full DocLayNet dataset (random selection performed respectively within the train, val and test files).
```
# !pip install -q datasets
from datasets import load_dataset
dataset_base = load_dataset("pierreguillou/DocLayNet-base")
# overview of dataset_base
DatasetDict({
train: Dataset({
features: ['id', 'texts', 'bboxes_block', 'bboxes_line', 'categories', 'image', 'pdf', 'page_hash', 'original_filename', 'page_no', 'num_pages', 'original_width', 'original_height', 'coco_width', 'coco_height', 'collection', 'doc_category'],
num_rows: 6910
})
validation: Dataset({
features: ['id', 'texts', 'bboxes_block', 'bboxes_line', 'categories', 'image', 'pdf', 'page_hash', 'original_filename', 'page_no', 'num_pages', 'original_width', 'original_height', 'coco_width', 'coco_height', 'collection', 'doc_category'],
num_rows: 648
})
test: Dataset({
features: ['id', 'texts', 'bboxes_block', 'bboxes_line', 'categories', 'image', 'pdf', 'page_hash', 'original_filename', 'page_no', 'num_pages', 'original_width', 'original_height', 'coco_width', 'coco_height', 'collection', 'doc_category'],
num_rows: 499
})
})
```
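Each row also carries the original single-page PDF as a base64 string (the `pdf` feature listed above); a small helper can decode it back to bytes (the helper name is illustrative):

```python
import base64

def decode_pdf(example, path):
    """Write the base64-encoded single-page PDF of a dataset row to disk
    and return the number of bytes written."""
    pdf_bytes = base64.b64decode(example["pdf"])
    with open(path, "wb") as f:
        f.write(pdf_bytes)
    return len(pdf_bytes)

# Usage (assuming `dataset_base` was loaded as shown above):
# decode_pdf(dataset_base["train"][0], "page.pdf")
```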
### Annotated bounding boxes
DocLayNet base makes it easy to display a document image with the annotated bounding boxes of paragraphs or lines.
Check the notebook [processing_DocLayNet_dataset_to_be_used_by_layout_models_of_HF_hub.ipynb](https://github.com/piegu/language-models/blob/master/processing_DocLayNet_dataset_to_be_used_by_layout_models_of_HF_hub.ipynb) in order to get the code.
#### Paragraphs

#### Lines

### HF notebooks
- [notebooks LayoutLM](https://github.com/NielsRogge/Transformers-Tutorials/tree/master/LayoutLM) (Niels Rogge)
- [notebooks LayoutLMv2](https://github.com/NielsRogge/Transformers-Tutorials/tree/master/LayoutLMv2) (Niels Rogge)
- [notebooks LayoutLMv3](https://github.com/NielsRogge/Transformers-Tutorials/tree/master/LayoutLMv3) (Niels Rogge)
- [notebooks LiLT](https://github.com/NielsRogge/Transformers-Tutorials/tree/master/LiLT) (Niels Rogge)
- [Document AI: Fine-tuning LiLT for document-understanding using Hugging Face Transformers](https://github.com/philschmid/document-ai-transformers/blob/main/training/lilt_funsd.ipynb) ([post](https://www.philschmid.de/fine-tuning-lilt#3-fine-tune-and-evaluate-lilt) of Phil Schmid)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://developer.ibm.com/exchanges/data/all/doclaynet/
- **Repository:** https://github.com/DS4SD/DocLayNet
- **Paper:** https://doi.org/10.1145/3534678.3539043
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
DocLayNet provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories. It provides several unique features compared to related work such as PubLayNet or DocBank:
1. *Human Annotation*: DocLayNet is hand-annotated by well-trained experts, providing a gold standard in layout segmentation through human recognition and interpretation of each page layout.
2. *Large layout variability*: DocLayNet includes diverse and complex layouts from a large variety of public sources in Finance, Science, Patents, Tenders, Law texts and Manuals.
3. *Detailed label set*: DocLayNet defines 11 class labels to distinguish layout features in high detail.
4. *Redundant annotations*: A fraction of the pages in DocLayNet are double- or triple-annotated, allowing estimation of annotation uncertainty and an upper bound on the prediction accuracy achievable with ML models.
5. *Pre-defined train, test and validation sets*: DocLayNet provides fixed sets for each, ensuring proportional representation of the class labels and avoiding leakage of unique layout styles across the sets.
### Supported Tasks and Leaderboards
We are hosting a competition in ICDAR 2023 based on the DocLayNet dataset. For more information see https://ds4sd.github.io/icdar23-doclaynet/.
## Dataset Structure
### Data Fields
DocLayNet provides four types of data assets:
1. PNG images of all pages, resized to square `1025 x 1025px`
2. Bounding-box annotations in COCO format for each PNG image
3. Extra: Single-page PDF files matching each PNG image
4. Extra: JSON file matching each PDF page, which provides the digital text cells with coordinates and content
The COCO image records are defined as in this example:
```js
...
{
"id": 1,
"width": 1025,
"height": 1025,
"file_name": "132a855ee8b23533d8ae69af0049c038171a06ddfcac892c3c6d7e6b4091c642.png",
// Custom fields:
"doc_category": "financial_reports" // high-level document category
"collection": "ann_reports_00_04_fancy", // sub-collection name
"doc_name": "NASDAQ_FFIN_2002.pdf", // original document filename
"page_no": 9, // page number in original document
"precedence": 0, // Annotation order, non-zero in case of redundant double- or triple-annotation
},
...
```
The `doc_category` field uses one of the following constants:
```
financial_reports,
scientific_articles,
laws_and_regulations,
government_tenders,
manuals,
patents
```
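To work with a single document category, the `doc_category` field can be used as a predicate with `datasets.Dataset.filter` (the helper below is an illustrative sketch, not part of the dataset):

```python
# Constants from the list above.
ALLOWED_CATEGORIES = {
    "financial_reports", "scientific_articles", "laws_and_regulations",
    "government_tenders", "manuals", "patents",
}

def keep_category(row, category):
    """Predicate for datasets.Dataset.filter: keep rows of one document category."""
    if category not in ALLOWED_CATEGORIES:
        raise ValueError(f"unknown doc_category: {category}")
    return row["doc_category"] == category

# Usage (assuming `dataset_base` was loaded as shown earlier):
# patents = dataset_base["train"].filter(lambda row: keep_category(row, "patents"))
```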
### Data Splits
The dataset provides three splits:
- `train`
- `val`
- `test`
## Dataset Creation
### Annotations
#### Annotation process
The labeling guidelines used for training the annotation experts are available at [DocLayNet_Labeling_Guide_Public.pdf](https://raw.githubusercontent.com/DS4SD/DocLayNet/main/assets/DocLayNet_Labeling_Guide_Public.pdf).
#### Who are the annotators?
Annotations are crowdsourced.
## Additional Information
### Dataset Curators
The dataset is curated by the [Deep Search team](https://ds4sd.github.io/) at IBM Research.
You can contact us at [deepsearch-core@zurich.ibm.com](mailto:deepsearch-core@zurich.ibm.com).
Curators:
- Christoph Auer, [@cau-git](https://github.com/cau-git)
- Michele Dolfi, [@dolfim-ibm](https://github.com/dolfim-ibm)
- Ahmed Nassar, [@nassarofficial](https://github.com/nassarofficial)
- Peter Staar, [@PeterStaar-IBM](https://github.com/PeterStaar-IBM)
### Licensing Information
License: [CDLA-Permissive-1.0](https://cdla.io/permissive-1-0/)
### Citation Information
```bib
@article{doclaynet2022,
title = {DocLayNet: A Large Human-Annotated Dataset for Document-Layout Segmentation},
doi = {10.1145/3534678.3539043},
url = {https://doi.org/10.1145/3534678.3539043},
author = {Pfitzmann, Birgit and Auer, Christoph and Dolfi, Michele and Nassar, Ahmed S and Staar, Peter W J},
year = {2022},
isbn = {9781450393850},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
booktitle = {Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
pages = {3743–3751},
numpages = {9},
location = {Washington DC, USA},
series = {KDD '22}
}
```
### Contributions
Thanks to [@dolfim-ibm](https://github.com/dolfim-ibm), [@cau-git](https://github.com/cau-git) for adding this dataset. |
chansung/synth_ds_test5 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: generators
dtype: string
- name: prompt_id
dtype: string
- name: seed_prompts
sequence: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 2323456
num_examples: 170
download_size: 67692
dataset_size: 2323456
---
# Dataset Card for "synth_ds_test5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct | ---
pretty_name: Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct](https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T04:21:29.440361](https://huggingface.co/datasets/open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct/blob/main/results_2023-10-15T04-21-29.440361.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1923238255033557,\n\
\ \"em_stderr\": 0.0040362200154763495,\n \"f1\": 0.23113255033557045,\n\
\ \"f1_stderr\": 0.0040754338170676495,\n \"acc\": 0.27782162588792425,\n\
\ \"acc_stderr\": 0.006982598384541777\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1923238255033557,\n \"em_stderr\": 0.0040362200154763495,\n\
\ \"f1\": 0.23113255033557045,\n \"f1_stderr\": 0.0040754338170676495\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5556432517758485,\n\
\ \"acc_stderr\": 0.013965196769083555\n }\n}\n```"
repo_url: https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T04_21_29.440361
path:
- '**/details_harness|drop|3_2023-10-15T04-21-29.440361.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T04-21-29.440361.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T04_21_29.440361
path:
- '**/details_harness|gsm8k|5_2023-10-15T04-21-29.440361.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T04-21-29.440361.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:31:16.803242.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:31:16.803242.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T04_21_29.440361
path:
- '**/details_harness|winogrande|5_2023-10-15T04-21-29.440361.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T04-21-29.440361.parquet'
- config_name: results
data_files:
- split: 2023_07_19T20_31_16.803242
path:
- results_2023-07-19T20:31:16.803242.parquet
- split: 2023_10_15T04_21_29.440361
path:
- results_2023-10-15T04-21-29.440361.parquet
- split: latest
path:
- results_2023-10-15T04-21-29.440361.parquet
---
# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct](https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T04:21:29.440361](https://huggingface.co/datasets/open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct/blob/main/results_2023-10-15T04-21-29.440361.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1923238255033557,
"em_stderr": 0.0040362200154763495,
"f1": 0.23113255033557045,
"f1_stderr": 0.0040754338170676495,
"acc": 0.27782162588792425,
"acc_stderr": 0.006982598384541777
},
"harness|drop|3": {
"em": 0.1923238255033557,
"em_stderr": 0.0040362200154763495,
"f1": 0.23113255033557045,
"f1_stderr": 0.0040754338170676495
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5556432517758485,
"acc_stderr": 0.013965196769083555
}
}
```
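As a convenience (a sketch, not part of the evaluation pipeline; the dict literal is simply copied from the results block above), the nested results can be flattened into `(task, metric, value)` rows for tabular analysis:

```python
# Results copied verbatim from the "Latest results" block above.
results = {
    "all": {
        "em": 0.1923238255033557,
        "em_stderr": 0.0040362200154763495,
        "f1": 0.23113255033557045,
        "f1_stderr": 0.0040754338170676495,
        "acc": 0.27782162588792425,
        "acc_stderr": 0.006982598384541777,
    },
    "harness|drop|3": {
        "em": 0.1923238255033557,
        "em_stderr": 0.0040362200154763495,
        "f1": 0.23113255033557045,
        "f1_stderr": 0.0040754338170676495,
    },
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {
        "acc": 0.5556432517758485,
        "acc_stderr": 0.013965196769083555,
    },
}

# One row per scalar metric: (task, metric, value).
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]
print(len(rows))  # 14
```

The same pattern works for any run's results JSON, since every task maps to a flat dict of scalar metrics.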
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
irodkin/babilong_qa1_train | ---
dataset_info:
features:
- name: target
dtype: string
- name: question
dtype: string
- name: input
dtype: string
splits:
- name: 4k
num_bytes: 935839145
num_examples: 64000
- name: 1k
num_bytes: 1811186532
num_examples: 640000
- name: 1.5k
num_bytes: 3066477975
num_examples: 640000
- name: 2.5k
num_bytes: 5578948466
num_examples: 640000
- name: 8k
num_bytes: 1944047473
num_examples: 64000
download_size: 8386163635
dataset_size: 13336499591
configs:
- config_name: default
data_files:
- split: 4k
path: data/4k-*
- split: 1k
path: data/1k-*
- split: 1.5k
path: data/1.5k-*
- split: 2.5k
path: data/2.5k-*
- split: 8k
path: data/8k-*
---
# Dataset Card for "babilong_qa1_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chrystians/oasst1_pl | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 30670882
num_examples: 28320
- name: validation
num_bytes: 1414743
num_examples: 1291
download_size: 11572068
dataset_size: 32085625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Norod78/caricature-portraits-blip-captions-512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1663841422.423
num_examples: 5619
download_size: 1662924830
dataset_size: 1663841422.423
pretty_name: Caricature portraits - Blip captions
license: cc-by-nc-sa-4.0
tags:
- text-to-image
size_categories:
- 1K<n<10K
---
# Dataset Card for "caricature-portraits-blip-captions-512"
## The 2D Caricature Dataset from [3D-CariGAN](https://github.com/qq775193759/3D-CariGAN) cropped to 512x512 and blip captioned
```
@article{ye2021caricature,
author = {Ye, Zipeng and Xia, Mengfei and Sun, Yanan and Yi, Ran and Yu, Minjing and Zhang, Juyong and Lai, Yu-Kun and Liu, Yong-Jin},
title = {3D-CariGAN: An End-to-End Solution to 3D Caricature Generation from Normal Face Photos},
journal = {IEEE Transactions on Visualization and Computer Graphics},
year = {2021},
doi={10.1109/TVCG.2021.3126659},
}
``` |
galleri5-ai/vitfashiontest | ---
license: openrail
---
|
CyberHarem/stanly_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of stanly/スタンリー/斯坦利 (Azur Lane)
This is the dataset of stanly/スタンリー/斯坦利 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `long_hair, purple_eyes, pink_hair, headband, hair_between_eyes, hairband, bangs, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 14.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 9.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 17.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 12.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 21.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/stanly_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, looking_at_viewer, jacket, solo, smile, blush, single_thighhigh, necktie, white_background, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | jacket | solo | smile | blush | single_thighhigh | necktie | white_background | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:---------|:-------|:--------|:--------|:-------------------|:----------|:-------------------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
agucci/mri-sym2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: line
dtype: string
- name: rad_score
dtype: string
- name: session
dtype: int64
splits:
- name: train
num_bytes: 68961229.076
num_examples: 1476
- name: test
num_bytes: 68472028.992
num_examples: 1674
download_size: 137564710
dataset_size: 137433258.06800002
tags:
- medical
---
# Dataset Card for mri-sym2
### Dataset Summary
SymBrain, an annotated dataset of brain MRI images designed to advance the field of brain symmetry detection and segmentation.
Our dataset comprises a diverse collection of brain MRI T1w and T2w scans from the [dHCP](https://biomedia.github.io/dHCP-release-notes/download.html) dataset,
each annotated to highlight the ideal **straight** mid-sagittal plane (MSP), demarcating the brain into two symmetrical hemispheres.
The accurate extraction of the MSP has the potential to greatly enhance segmentation precision.
Researchers and practitioners can utilize this dataset to devise innovative methods for enhanced brain MRI image segmentation.
SymBrain's rich and extensive content empowers the research community to address complex challenges in neuroimaging analysis,
ultimately contributing to advancements in medical diagnostics and treatment planning.
Symmetry analysis plays an important role in medical image processing, particularly in the detection of diseases and malformations.
SymBrain leverages the inherent bilateral symmetry observed in brain MRI images,
making it an invaluable resource for the development and evaluation
of automated algorithms aimed at detecting the symmetry axis within brain MRI data.
## Dataset Structure
The dataset contains 1476 T1w images and 1674 T2w images.
The differences between the modalities lie in the intensity variations of the different brain areas.
The T1w images are available in the 'train' split and the T2w images in the 'test' split.
## Dataset Creation
### Loading the data
The dataset contains a 'train' split of 1476 rows, containing the t1 type images, and a 'test' split of 1674 rows, with the t2 type images.
```python
from datasets import load_dataset

dataset = load_dataset("agucci/mri-sym2")
# select the first example of the training split:
dataset['train'][0]
```
**Attributes :**
- *image:* PIL image, shape (290, 290)
- *line:* Straight-line annotation coordinates on the image: ({'x':x1, 'y':y1}, {'x':x2, 'y':y2}), where (x1, y1) and (x2, y2) are the start and end points of the line.
- *radscore:* Radiology score of the volume the image was extracted from. Please refer to the [dHCP documentation](https://biomedia.github.io/dHCP-release-notes/download.html#metadata) for an explanation of the scores.
- *session:* Session-ID of the original dataset, used for scan retrieval.
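For illustration only (the helper name and the angle convention are assumptions, not part of the dataset), the two endpoints stored in the `line` field can be converted into a deviation angle of the annotated symmetry axis:

```python
import math

def line_angle_degrees(line):
    """Deviation of the annotated mid-sagittal line from vertical, in degrees.

    `line` is the pair of endpoint dicts from the `line` field:
    ({'x': x1, 'y': y1}, {'x': x2, 'y': y2}).
    """
    p1, p2 = line
    dx = p2['x'] - p1['x']
    dy = p2['y'] - p1['y']
    # atan2 gives the angle from the x-axis; subtracting from 90 measures
    # the deviation from a perfectly vertical axis.
    return 90.0 - math.degrees(math.atan2(dy, dx))

# A perfectly vertical annotation line has (near-)zero deviation:
print(line_angle_degrees(({'x': 145, 'y': 0}, {'x': 145, 'y': 289})))
```

This kind of angle is one plausible target when training a model to predict the MSP from the raw image.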
### Source Data
[dHCP](https://biomedia.github.io/dHCP-release-notes/download.html) dataset.
Three slices have been extracted from each of the 1050 3D volumes, creating 3150 images.
### Annotations
The authors annotated the images manually with the [V7Labs tools](https://www.v7labs.com/).
### Licensing Information
mit
### Citation Information
When using the data, please cite:
```bibtex
@misc{gucciardi2024symbrain,
title={Symbrain: A large-scale dataset of MRI images for neonatal brain symmetry analysis},
author={Arnaud Gucciardi and Safouane El Ghazouali and Francesca Venturini and Vida Groznik and Umberto Michelucci},
year={2024},
eprint={2401.11814},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
and
**dhcp dataset**
Data were provided by the developing Human Connectome Project, KCL-Imperial-Oxford Consortium funded by the European Research Council under the European Union Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. [319456]. We are grateful to the families who generously supported this trial. |
anan-2024/twitter_dataset_1713126987 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25514
num_examples: 60
download_size: 15309
dataset_size: 25514
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mtc/full_cleaned_xsum_faith_all_gpt-3-5_explanations | ---
dataset_info:
features:
- name: document
dtype: string
- name: claim
dtype: string
- name: bbcid
dtype: string
- name: model_name
dtype: string
- name: label
dtype: string
- name: split
dtype: string
- name: annotations
sequence: string
- name: explanation
dtype: string
splits:
- name: train
num_bytes: 2820056
num_examples: 1048
- name: validation
num_bytes: 451854
num_examples: 200
- name: test
num_bytes: 2917395
num_examples: 1247
download_size: 2253365
dataset_size: 6189305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Atipico1/NQ | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 292966084
num_examples: 87925
- name: test
num_bytes: 12097860
num_examples: 3610
download_size: 180955386
dataset_size: 305063944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
NetherlandsForensicInstitute/msmarco-translated-nl | ---
viewer: true
task_categories:
- sentence-similarity
language:
- nl
size_categories:
- 1M<n<10M
---
This is a Dutch version of the [MS MARCO](https://microsoft.github.io/msmarco/) dataset, which we have auto-translated from English into Dutch using Meta's [No Language Left Behind](https://ai.facebook.com/research/no-language-left-behind/) model, specifically the [huggingface implementation](https://huggingface.co/facebook/nllb-200-distilled-600M). |
SaranaAbidueva/buryat-russian_parallel_corpus | ---
dataset_info:
features:
- name: bxr
dtype: string
- name: ru
dtype: string
- name: corpus
dtype: string
splits:
- name: train
num_bytes: 8989074
num_examples: 38260
download_size: 4394110
dataset_size: 8989074
license: cc-by-4.0
language:
- ru
task_categories:
- translation
size_categories:
- 10K<n<100K
---
# Dataset Card for "buryat-russian_parallel_corpus"
Датасет состоит из 38260 пар на русском и бурятском языках. Из них 19411 пар предложений и 20058 пар слов. <br/>
Статистика по источникам: <br/>
<br/>
библия 7519 <br/>
книги 5250 <br/>
татоеба 807 <br/>
стихи 471 <br/>
стихи Нимбуев 1210 <br/>
словарь 20058 <br/>
википедия 1882 <br/>
законы 1063 <br/>
<br/>
The dataset consists of 38260 pairs in Russian and Buryat languages. Of these, 19411 pairs of sentences and 20058 pairs of words. <br/>
Source stats: <br/>
<br/>
bible 7519 <br/>
books 5250 <br/>
tatoeba 807 <br/>
poems 471 <br/>
poems Nimbuev 1210 <br/>
dictionary 20058 <br/>
wikipedia 1882 <br/>
laws 1063 <br/>
<br/>
<br/>
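As a quick sanity check (a sketch, not part of the dataset card), the per-source counts listed above do add up to the stated total of 38260 pairs:

```python
# Per-source pair counts, transcribed from the statistics above.
source_counts = {
    "bible": 7519,
    "books": 5250,
    "tatoeba": 807,
    "poems": 471,
    "poems Nimbuev": 1210,
    "dictionary": 20058,
    "wikipedia": 1882,
    "laws": 1063,
}

total = sum(source_counts.values())
print(total)  # 38260
```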
@inproceedings{abidueva2023buryat,<br/>
  title={Buryat-Russian parallel corpus},<br/>
  author={Sarana Abidueva and Dari Baturova},<br/>
  year={2023}<br/>
} |
Falah/iraqi_image_caption | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 10313555.0
num_examples: 147
download_size: 9615727
dataset_size: 10313555.0
---
# Dataset Card for "iraqi_image_caption"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ogirimah/ask_herts | ---
license: apache-2.0
---
|
Yankz/tr_dataset-formatted | ---
dataset_info:
features:
- name: Correct
dtype: string
- name: Wrong
dtype: string
splits:
- name: train
num_bytes: 600279600.4235923
num_examples: 83740
- name: validation
num_bytes: 173206228
num_examples: 24298
- name: test
num_bytes: 173753059
num_examples: 24299
download_size: 477205502
dataset_size: 947238887.4235923
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
one-sec-cv12/chunk_273 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 5506047648.25
num_examples: 57326
download_size: 4808654570
dataset_size: 5506047648.25
---
# Dataset Card for "chunk_273"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Samee-ur/guanaco-100 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 189498
num_examples: 100
download_size: 114615
dataset_size: 189498
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nolan1206/Nolan_whisper_educate1 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 82981.0
num_examples: 3
- name: test
num_bytes: 82981.0
num_examples: 3
download_size: 84786
dataset_size: 165962.0
---
# Dataset Card for "Nolan_whisper_educate1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KimuGenie/KLUE_mrc_negative_train | ---
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: document_id
dtype: int64
- name: hard_negative_text
sequence: string
- name: hard_negative_document_id
sequence: int64
- name: hard_negative_title
sequence: string
splits:
- name: train
num_bytes: 205021808
num_examples: 3952
- name: validation
num_bytes: 12329366
num_examples: 240
download_size: 124133126
dataset_size: 217351174
license: cc-by-4.0
task_categories:
- question-answering
language:
- ko
---
# Dataset Card for "KLUE_mrc_negative_train"
This dataset adds 20 hard negative texts per question, retrieved with BM25, to the KLUE MRC train dataset.
The hard negative texts were found with BM25, and duplicate data was removed as much as possible through preprocessing.
The retrieval accuracy of the BM25 retriever used is shown below.
|top-k|top-10|top-20|top-50|top-100|
|-|-|-|-|-|
|accuracy(%)|92.1|95.0|97.1|98.8|
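As a hedged sketch (the example record is invented; only the field names come from the schema above), one way to expand a row into `(question, positive, negative)` triples for training a dense retriever:

```python
def to_triples(example):
    """Yield one (question, positive_context, hard_negative) triple
    per hard negative attached to the example."""
    for negative in example["hard_negative_text"]:
        yield (example["question"], example["context"], negative)

# Invented toy record using the field names documented in the schema above;
# real rows carry 20 hard negatives per question.
record = {
    "question": "Example question?",
    "context": "Gold passage containing the answer.",
    "hard_negative_text": ["BM25 distractor 1", "BM25 distractor 2"],
}

triples = list(to_triples(record))
print(len(triples))  # 2
```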
# Citation
```
@misc{park2021klue,
title={KLUE: Korean Language Understanding Evaluation},
author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jungwoo Ha and Kyunghyun Cho},
year={2021},
eprint={2105.09680},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora | ---
pretty_name: Evaluation run of Aspik101/llama-30b-instruct-2048-PL-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/llama-30b-instruct-2048-PL-lora](https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T18:18:54.514203](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora/blob/main/results_2023-10-17T18-18-54.514203.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006082214765100671,\n\
\ \"em_stderr\": 0.0007962432393028902,\n \"f1\": 0.08940016778523495,\n\
\ \"f1_stderr\": 0.0017994699002823276,\n \"acc\": 0.4795769199239097,\n\
\ \"acc_stderr\": 0.010868072702003555\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006082214765100671,\n \"em_stderr\": 0.0007962432393028902,\n\
\ \"f1\": 0.08940016778523495,\n \"f1_stderr\": 0.0017994699002823276\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1683093252463988,\n \
\ \"acc_stderr\": 0.010305695358125522\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881588\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T18_18_54.514203
path:
- '**/details_harness|drop|3_2023-10-17T18-18-54.514203.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T18-18-54.514203.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T18_18_54.514203
path:
- '**/details_harness|gsm8k|5_2023-10-17T18-18-54.514203.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T18-18-54.514203.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T12_31_39.765804
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:31:39.765804.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T12:31:39.765804.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T18_18_54.514203
path:
- '**/details_harness|winogrande|5_2023-10-17T18-18-54.514203.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T18-18-54.514203.parquet'
- config_name: results
data_files:
- split: 2023_10_17T18_18_54.514203
path:
- results_2023-10-17T18-18-54.514203.parquet
- split: latest
path:
- results_2023-10-17T18-18-54.514203.parquet
---
# Dataset Card for Evaluation run of Aspik101/llama-30b-instruct-2048-PL-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/llama-30b-instruct-2048-PL-lora](https://huggingface.co/Aspik101/llama-30b-instruct-2048-PL-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora",
"harness_winogrande_5",
	split="latest")
```
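Each run's timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_` (this mapping is inferred from the split names listed in this card's configuration section); a small helper makes the correspondence explicit:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp like '2023-10-17T18:18:54.514203' to the
    split name used by this repository's configurations."""
    return timestamp.replace("-", "_").replace(":", "_")

# run_timestamp_to_split("2023-10-17T18:18:54.514203")
# -> "2023_10_17T18_18_54.514203"
```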
## Latest results
These are the [latest results from run 2023-10-17T18:18:54.514203](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-instruct-2048-PL-lora/blob/main/results_2023-10-17T18-18-54.514203.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006082214765100671,
"em_stderr": 0.0007962432393028902,
"f1": 0.08940016778523495,
"f1_stderr": 0.0017994699002823276,
"acc": 0.4795769199239097,
"acc_stderr": 0.010868072702003555
},
"harness|drop|3": {
"em": 0.006082214765100671,
"em_stderr": 0.0007962432393028902,
"f1": 0.08940016778523495,
"f1_stderr": 0.0017994699002823276
},
"harness|gsm8k|5": {
"acc": 0.1683093252463988,
"acc_stderr": 0.010305695358125522
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881588
}
}
```
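The "all" block aggregates the per-task numbers; for accuracy it appears to be the unweighted mean of the task accuracies, which can be checked directly (the dictionary below simply mirrors the JSON above):

```python
# Per-task and aggregated accuracies, copied from the results JSON above
results = {
    "all": {"acc": 0.4795769199239097},
    "harness|gsm8k|5": {"acc": 0.1683093252463988},
    "harness|winogrande|5": {"acc": 0.7908445146014207},
}

# "all" holds the unweighted mean of the per-task accuracies
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)
assert abs(mean_acc - results["all"]["acc"]) < 1e-12
```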
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
linhqyy/data_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 9016799768.976877
num_examples: 64418
- name: test
num_bytes: 1001925014.6631207
num_examples: 7158
download_size: 10000087591
dataset_size: 10018724783.639997
---
# Dataset Card for "data_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ta_class_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ta_class/戦艦タ級 (Kantai Collection)
This is the dataset of ta_class/戦艦タ級 (Kantai Collection), containing 91 images and their tags.
The core tags of this character are `long_hair, grey_hair, pale_skin, breasts, white_hair, yellow_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 91 | 84.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ta_class_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 91 | 55.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ta_class_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 164 | 97.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ta_class_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 91 | 76.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ta_class_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 164 | 128.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ta_class_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ta_class_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
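For the IMG+TXT packages, each image is shipped alongside a same-named `.txt` file holding its tags; after extracting one of those archives, the pairs can be collected with a small helper (the comma-separated tag format is an assumption based on common Danbooru-style tag files, not something this card specifies):

```python
import os

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def load_img_txt_pairs(dataset_dir):
    """Pair each image in dataset_dir with the tags read from its sibling .txt file."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.isfile(txt_path):
            continue  # skip images without a tag file
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```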
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 49 |  |  |  |  |  | abyssal_ship, 1girl, serafuku, solo, smile, black_panties, looking_at_viewer, navel, thighhighs, green_eyes, cape, glowing_eyes, no_pants |
| 1 | 7 |  |  |  |  |  | 1girl, abyssal_ship, neckerchief, sailor_shirt, serafuku, solo, hair_between_eyes, looking_at_viewer, shiny, short_sleeves, upper_body, blush, blue_sailor_collar, medium_breasts, open_mouth, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | abyssal_ship | 1girl | serafuku | solo | smile | black_panties | looking_at_viewer | navel | thighhighs | green_eyes | cape | glowing_eyes | no_pants | neckerchief | sailor_shirt | hair_between_eyes | shiny | short_sleeves | upper_body | blush | blue_sailor_collar | medium_breasts | open_mouth | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:-----------|:-------|:--------|:----------------|:--------------------|:--------|:-------------|:-------------|:-------|:---------------|:-----------|:--------------|:---------------|:--------------------|:--------|:----------------|:-------------|:--------|:---------------------|:-----------------|:-------------|:--------------------|:-------------------|
| 0 | 49 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Dans-DiscountModels__TinyLlama-1.1B-FFT-Test2 | ---
pretty_name: Evaluation run of Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2](https://huggingface.co/Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dans-DiscountModels__TinyLlama-1.1B-FFT-Test2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T02:54:33.981160](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyLlama-1.1B-FFT-Test2/blob/main/results_2023-12-18T02-54-33.981160.json) (note\
\ that there might be results for other tasks in the repository if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2607212816484538,\n\
\ \"acc_stderr\": 0.030867007452377485,\n \"acc_norm\": 0.26236232884330074,\n\
\ \"acc_norm_stderr\": 0.03165482907989167,\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460972,\n \"mc2\": 0.3631613697388407,\n\
\ \"mc2_stderr\": 0.013778296923024373\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.31399317406143346,\n \"acc_stderr\": 0.013562691224726291,\n\
\ \"acc_norm\": 0.34215017064846415,\n \"acc_norm_stderr\": 0.013864152159177278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44054969129655447,\n\
\ \"acc_stderr\": 0.004954384702021655,\n \"acc_norm\": 0.5795658235411273,\n\
\ \"acc_norm_stderr\": 0.004926198483948714\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.03302789859901717,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.03302789859901717\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882921,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882921\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724077,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724077\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029469,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029469\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20634920634920634,\n \"acc_stderr\": 0.020842290930114683,\n \"\
acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.020842290930114683\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994107,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994107\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386396,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386396\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178263,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178263\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417392,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417392\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868963,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868963\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.01771260052872273,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.01771260052872273\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484256,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094472,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395977,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395977\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.014487500852850414,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.014487500852850414\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632935,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632935\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142765,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.15510204081632653,\n \"acc_stderr\": 0.0231747988612186,\n\
\ \"acc_norm\": 0.15510204081632653,\n \"acc_norm_stderr\": 0.0231747988612186\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460972,\n \"mc2\": 0.3631613697388407,\n\
\ \"mc2_stderr\": 0.013778296923024373\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5880031570639306,\n \"acc_stderr\": 0.013833112857645935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401502019\n }\n}\n```"
repo_url: https://huggingface.co/Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|arc:challenge|25_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|gsm8k|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hellaswag|10_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T02-54-33.981160.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T02-54-33.981160.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- '**/details_harness|winogrande|5_2023-12-18T02-54-33.981160.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T02-54-33.981160.parquet'
- config_name: results
data_files:
- split: 2023_12_18T02_54_33.981160
path:
- results_2023-12-18T02-54-33.981160.parquet
- split: latest
path:
- results_2023-12-18T02-54-33.981160.parquet
---
# Dataset Card for Evaluation run of Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2](https://huggingface.co/Dans-DiscountModels/TinyLlama-1.1B-FFT-Test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dans-DiscountModels__TinyLlama-1.1B-FFT-Test2",
	"harness_winogrande_5",
	split="latest")
```
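As the configuration listing above shows, each timestamped split name is simply the run timestamp with its `-` (date) and `:` (time) separators replaced by underscores. A minimal helper sketching that mapping (illustrative only, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (e.g. "2023-12-18T02:54:33.981160")
    into the corresponding split name (e.g. "2023_12_18T02_54_33.981160")."""
    date_part, time_part = ts.split("T")
    # Dashes in the date and colons in the time both become underscores.
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(timestamp_to_split("2023-12-18T02:54:33.981160"))
```

This is handy when selecting a specific historical run rather than the "latest" split.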
## Latest results
These are the [latest results from run 2023-12-18T02:54:33.981160](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyLlama-1.1B-FFT-Test2/blob/main/results_2023-12-18T02-54-33.981160.json) (note that there might be results for other tasks in the repositories if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2607212816484538,
"acc_stderr": 0.030867007452377485,
"acc_norm": 0.26236232884330074,
"acc_norm_stderr": 0.03165482907989167,
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460972,
"mc2": 0.3631613697388407,
"mc2_stderr": 0.013778296923024373
},
"harness|arc:challenge|25": {
"acc": 0.31399317406143346,
"acc_stderr": 0.013562691224726291,
"acc_norm": 0.34215017064846415,
"acc_norm_stderr": 0.013864152159177278
},
"harness|hellaswag|10": {
"acc": 0.44054969129655447,
"acc_stderr": 0.004954384702021655,
"acc_norm": 0.5795658235411273,
"acc_norm_stderr": 0.004926198483948714
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.03302789859901717,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.03302789859901717
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882921,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882921
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724077,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724077
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029469,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029469
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.020842290930114683,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.020842290930114683
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994107,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994107
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386396,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386396
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417392,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417392
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868963,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868963
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.01771260052872273,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.01771260052872273
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484256,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094472,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395977,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395977
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850414,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850414
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632935,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632935
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15510204081632653,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.15510204081632653,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460972,
"mc2": 0.3631613697388407,
"mc2_stderr": 0.013778296923024373
},
"harness|winogrande|5": {
"acc": 0.5880031570639306,
"acc_stderr": 0.013833112857645935
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401502019
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-one-sec-cv12-each-chunk-uniq/chunk_95 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1258894996.0
num_examples: 245303
download_size: 1289377418
dataset_size: 1258894996.0
---
# Dataset Card for "chunk_95"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
opennyaiorg/aibe_dataset | ---
language:
- en
license: cc-by-nd-4.0
size_categories:
- 1K<n<10K
task_categories:
- question-answering
pretty_name: AIBE Dataset
tags:
- legal
dataset_info:
features:
- name: exam_name
dtype: string
- name: exam_number
dtype: string
- name: question_number
dtype: int64
- name: question_text
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 297975
num_examples: 1157
download_size: 184347
dataset_size: 297975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# All India Bar Exam (AIBE) Dataset
<!-- Provide a quick summary of the dataset. -->
Welcome to the All India Bar Exam (AIBE) Dataset! This dataset is designed to facilitate the evaluation of Natural Language Processing (NLP) models, particularly in the field of legal studies. It contains a collection of questions from various AIBE exams, along with multiple-choice options and correct answers.
## Purpose
The primary purpose of this dataset is to serve as a valuable resource for researchers, developers, and practitioners working on legal NLP and large language models. By using this dataset, you can evaluate the performance of your models in understanding and processing legal questions, which can be crucial for applications such as legal document analysis, legal chatbots, and more.
### Content
The dataset includes questions from the past 12 years of AIBE exams, ranging from AIBE 4 to AIBE 16.
* A collection of questions from different All India Bar Exams.
* Multiple-choice options are associated with each question.
* The correct answer for each question.
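As a minimal sketch of how records in this dataset might be consumed for evaluation (field names follow the `dataset_info` schema above; the sample records below are illustrative stand-ins, not real exam content), scoring a model's multiple-choice predictions reduces to a simple accuracy computation:

```python
# Minimal sketch (illustrative records, not actual exam questions): scoring a
# model's multiple-choice predictions against AIBE-style records. Field names
# follow the dataset_info schema above; the real data can be loaded with
# datasets.load_dataset("opennyaiorg/aibe_dataset") if the `datasets` library
# is installed.

records = [
    {
        "exam_name": "AIBE",
        "exam_number": "4",
        "question_number": 1,
        "question_text": "Hypothetical question one?",
        "options": {"A": "opt A", "B": "opt B", "C": "opt C", "D": "opt D"},
        "answer": "B",
    },
    {
        "exam_name": "AIBE",
        "exam_number": "4",
        "question_number": 2,
        "question_text": "Hypothetical question two?",
        "options": {"A": "opt A", "B": "opt B", "C": "opt C", "D": "opt D"},
        "answer": "D",
    },
]

def accuracy(records, predictions):
    """Fraction of questions where the predicted option letter matches `answer`."""
    correct = sum(1 for rec, pred in zip(records, predictions) if pred == rec["answer"])
    return correct / len(records)

print(accuracy(records, ["B", "A"]))  # one of two correct -> 0.5
```

A model under evaluation would emit one option letter ("A"–"D") per question; comparing those letters against the `answer` field as above yields the exam-level accuracy.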
### Intended Use
This dataset is intended for evaluation purposes only. Please refrain from altering the dataset to maintain its integrity and ensure fair evaluations. Users are encouraged to use the dataset responsibly, respecting legal and ethical guidelines.
<!--
### Citation
If you use this dataset in your work or research, please cite it appropriately to acknowledge the source:
```bibtex
[Author Name(s), TO BE UPDATED]
```
-->
### Usage Guidelines
* Evaluation Only: Use this dataset solely for evaluating the performance of NLP models on legal questions.
* No Alterations: Do not alter the dataset. Any modifications may compromise the integrity of the evaluation.
* Attribution: If you use this dataset in your research or projects, kindly attribute it as specified in the citation section.
### Disclaimer
The dataset is provided "as is" without any warranty. The authors and contributors are not responsible for any errors or omissions in the dataset. Use it at your own discretion.
### Feedback
We welcome feedback, suggestions, or any issues you may encounter with the dataset. Feel free to contribute to its improvement by providing feedback on the Hugging Face platform or by contacting the dataset maintainers.
Thank you for using the All India Bar Exam (AIBE) Dataset! We hope it proves to be a valuable asset for your research and development in the field of legal NLP. |
yangyz1230/H3K36me3 | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 508421
num_examples: 901
- name: test
num_bytes: 60916
num_examples: 108
download_size: 275101
dataset_size: 569337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|