datasetId | card |
|---|---|
BeIR/nfcorpus-qrels | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100K<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments. For example, the relevance judgements in this repository can be loaded with the `datasets` library:
```python
from datasets import load_dataset

# Load the NFCorpus qrels hosted under the BeIR organization
qrels = load_dataset("BeIR/nfcorpus-qrels")
```
### Supported Tasks and Leaderboards
BEIR evaluates retrieval models in a zero-shot setup, reporting retrieval metrics such as nDCG@10 and Recall@100 on each of its datasets.
The current best performing models can be found on the [leaderboard](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` (JSON Lines) file containing one dictionary per line, each with three fields: `_id` (a unique document identifier), `title` (the document title; optional), and `text` (a document paragraph or passage). For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` (JSON Lines) file containing one dictionary per line, each with two fields: `_id` (a unique query identifier) and `text` (the query text). For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` (tab-separated) file with three columns in this order: `query-id`, `corpus-id` and `score`. Keep the first row as a header. For example: `q1 doc1 1`
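As a minimal sketch of the qrels format described above, the `.tsv` file can be read into the nested `qrels` dictionary shown in the next section (the filename here is a placeholder):

```python
import csv

# Read a BEIR-style qrels .tsv into {query-id: {corpus-id: score}}.
# The first row is a header ("query-id", "corpus-id", "score") and is skipped.
def load_qrels(path):
    qrels = {}
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter="\t")
        next(reader)  # skip the header row
        for query_id, corpus_id, score in reader:
            qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels
```
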
### Data Instances
A high-level example from a BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the unique query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
SocialGrep/one-year-of-tsla-on-reddit | ---
annotations_creators:
- lexyr
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
---
# Dataset Card for one-year-of-tsla-on-reddit
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Homepage:** [https://socialgrep.com/datasets](https://socialgrep.com/datasets/one-year-of-tsla-on-reddit?utm_source=huggingface&utm_medium=link&utm_campaign=oneyearoftslaonreddit)
- **Reddit downloader used:** [https://socialgrep.com/exports](https://socialgrep.com/exports?utm_source=huggingface&utm_medium=link&utm_campaign=oneyearoftslaonreddit)
- **Point of Contact:** [Website](https://socialgrep.com/contact?utm_source=huggingface&utm_medium=link&utm_campaign=oneyearoftslaonreddit)
### Dataset Summary
A year's worth of mentions of Tesla Inc. (TSLA) in Reddit posts and comments.
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A data point is either a post or a comment. Because the two differ structurally, they are stored in two separate files, although many fields are shared.
### Data Fields
- 'type': the type of the data point. Can be 'post' or 'comment'.
- 'id': the base-36 Reddit ID of the data point. Unique when combined with type.
- 'subreddit.id': the base-36 Reddit ID of the data point's host subreddit. Unique.
- 'subreddit.name': the human-readable name of the data point's host subreddit.
- 'subreddit.nsfw': a boolean marking the data point's host subreddit as NSFW or not.
- 'created_utc': a UTC timestamp for the data point.
- 'permalink': a reference link to the data point on Reddit.
- 'score': score of the data point on Reddit.
- 'domain': (Post only) the domain of the data point's link.
- 'url': (Post only) the destination of the data point's link, if any.
- 'selftext': (Post only) the self-text of the data point, if any.
- 'title': (Post only) the title of the post data point.
- 'body': (Comment only) the body of the comment data point.
- 'sentiment': (Comment only) the result of an in-house sentiment analysis pipeline. Used for exploratory analysis.
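The card does not specify the export file format, but assuming the two files are CSV exports with the column names listed above (the filename below is hypothetical), filtering out NSFW-subreddit data points can be sketched as:

```python
import csv

# Iterate over one export file, yielding only data points whose host
# subreddit is not marked NSFW. Assumes the boolean is serialized as
# the string "false"/"true" - an assumption, not documented behavior.
def iter_sfw(path):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["subreddit.nsfw"] == "false":
                yield row
```
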
## Additional Information
### Licensing Information
CC-BY v4.0
|
ummagumm-a/reddit_posts | ---
dataset_info:
features:
- name: post_id
dtype: string
- name: post_title
dtype: string
- name: post_body
dtype: string
- name: subreddit
dtype: string
- name: post_url
dtype: string
- name: flair_text
dtype: string
- name: score
dtype: int64
- name: comments
dtype: int64
- name: upvote_ratio
dtype: float64
- name: date-time
dtype: string
splits:
- name: train
num_bytes: 97334
num_examples: 320
- name: test
num_bytes: 27972
num_examples: 80
download_size: 81893
dataset_size: 125306
---
# Dataset Card for "reddit_posts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pqai/PoC | ---
license: mit
---
# PoC (Patents with One Citation) dataset
This dataset is useful for training or evaluating models that predict patent-to-patent similarity, such as those used for patent searching.
It was developed and used for the training of an ML model that powers the [PQAI](https://search.projectpq.ai/) search engine.
## Details
The dataset contains 90,013 samples.
Each sample contains:
- a subject patent (`sp`)
- its only citation (`cit`)
- its CPC code (`cpc`)
- a list of 10 patents (`sims`) that are similar to `sp` (in that they share the CPC code) and published before `sp`
Every line of the dataset is a JSON-parsable string (`.jsonl` format) which, when parsed, yields an array of this format:
```
[pn, cit, cpc, [...sims]]
```
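Parsing one such line is a one-liner with the standard library; the patent numbers and CPC code below are made-up placeholders, not real entries from the dataset:

```python
import json

# Parse one PoC sample line into its four components
line = '["US-0000001-A", "US-0000002-A", "G06F", ["US-0000003-A", "US-0000004-A"]]'
sp, cit, cpc, sims = json.loads(line)
```
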
## Task
Given the subject patent `sp`, the task is to assign a similarity score to each patent in `[cit, ...sims]`. Ideally, the score should be highest for `cit`.
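The task above can be sketched as follows, with `score_fn` standing in for any (hypothetical) pairwise similarity model you want to evaluate:

```python
# Score every candidate for one sample and check whether the true
# citation `cit` is ranked first (ties favor `cit`, which comes first).
def citation_ranked_first(score_fn, sp, cit, sims):
    candidates = [cit] + list(sims)
    best = max(candidates, key=lambda pn: score_fn(sp, pn))
    return best == cit
```

Averaging this boolean over all 90,013 samples gives the accuracy metric listed below.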
## Metrics
It's a ranking task, so the following metrics make the most sense:
- DCG/NDCG
- Accuracy |
chaenykim/supernova-timeseries | ---
license: mit
dataset_info:
features:
- name: objid
dtype: int32
- name: times_wv
dtype:
array2_d:
shape:
- 300
- 2
dtype: float64
- name: target
dtype:
array2_d:
shape:
- 300
- 2
dtype: float64
- name: label
dtype:
class_label:
names:
'0': $\mu$-Lens-Single
'1': TDE
'2': EB
'3': SNII
'4': SNIax
'5': Mira
'6': SNIbc
'7': KN
'8': M-dwarf
'9': SNIa-91bg
'10': AGN
'11': SNIa
'12': RRL
'13': SLSN-I
'14': extra
- name: redshift
dtype: float32
splits:
- name: train
num_bytes: 75438576
num_examples: 6274
- name: validation
num_bytes: 9402768
num_examples: 782
- name: test
num_bytes: 9523008
num_examples: 792
download_size: 33374835
dataset_size: 94364352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
rushai-dev/name_gen | ---
license: apache-2.0
---
|
CVasNLPExperiments/Hatefulmemes_validation_google_flan_t5_xxl_mode_C_T_OCR_rices_ns_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 371466
num_examples: 500
download_size: 59368
dataset_size: 371466
---
# Dataset Card for "Hatefulmemes_validation_google_flan_t5_xxl_mode_C_T_OCR_rices_ns_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/st_specter_train_triples | ---
license: mit
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 158287267
num_examples: 684100
download_size: 65955356
dataset_size: 158287267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_who_what | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 23743
num_examples: 108
- name: test
num_bytes: 11004
num_examples: 50
- name: train
num_bytes: 48342
num_examples: 185
download_size: 66700
dataset_size: 83089
---
# Dataset Card for "MULTI_VALUE_stsb_who_what"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hsp9308/korquad_aihub | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 156721973
num_examples: 122091
- name: validation
num_bytes: 16406685
num_examples: 12628
download_size: 64923300
dataset_size: 173128658
---
# Dataset Card for "korquad_aihub"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kirito8963/khoabatocomgenow | ---
license: apache-2.0
---
|
hlt-lab/personachatsample-repeat_itself | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: reference
dtype: string
splits:
- name: train
num_bytes: 35611
num_examples: 100
download_size: 27419
dataset_size: 35611
---
# Dataset Card for "personachatsample-repeat_itself"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlppAI/SlimPajama-chunked | ---
task_categories:
- text-generation
language:
- en
pretty_name: SlimPajama-Chunked
---
# SlimPajama-Chunked
## Dataset Description
This is a chunked re-upload of Cerebras' [SlimPajama-627B](https://huggingface.co/datasets/cerebras/SlimPajama-627B). The original upload splits
the dataset into 10 chunks, each containing upwards of 5,000 files, which makes it cumbersome to download and process. We downloaded the entire
dataset for our own purposes and decided to upload this chunked version for easier use.
Each file is ~45GB due to HuggingFace's limitation of 50GB per LFS file. |
systemk/origami-data | ---
dataset_info:
- config_name: v0.2
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1665085111.8754308
num_examples: 359366
download_size: 3133404413
dataset_size: 1665085111.8754308
- config_name: v0.2-full
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1433415294.4921181
num_examples: 309366
download_size: 2700841746
dataset_size: 1433415294.4921181
- config_name: v0.2-lora
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 231669817.38331267
num_examples: 50000
download_size: 432797937
dataset_size: 231669817.38331267
- config_name: v0.2:lora
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 231669817.38331267
num_examples: 50000
download_size: 432797937
dataset_size: 231669817.38331267
- config_name: v0.3
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2487928047.6435037
num_examples: 518732
download_size: 4575030520
dataset_size: 2487928047.6435037
- config_name: v0.3-full
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2008310818.0059445
num_examples: 418732
download_size: 3707433739
dataset_size: 2008310818.0059445
- config_name: v0.3-lora
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 479617229.63755924
num_examples: 100000
download_size: 867658472
dataset_size: 479617229.63755924
configs:
- config_name: v0.2
data_files:
- split: train
path: v0.2/train-*
- config_name: v0.2-full
data_files:
- split: train
path: v0.2-full/train-*
- config_name: v0.2-lora
data_files:
- split: train
path: v0.2-lora/train-*
- config_name: v0.3
data_files:
- split: train
path: v0.3/train-*
- config_name: v0.3-full
data_files:
- split: train
path: v0.3-full/train-*
- config_name: v0.3-lora
data_files:
- split: train
path: v0.3-lora/train-*
---
|
csebuetnlp/BanglaParaphrase | ---
annotations_creators:
- found
language_creators:
- found
language:
- bn
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: BanglaParaphrase
tags:
- conditional-text-generation
- paraphrase-generation
---
# Dataset Card for "BanglaParaphrase"
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [https://github.com/csebuetnlp/banglaparaphrase](https://github.com/csebuetnlp/banglaparaphrase)
- **Paper:** [BanglaParaphrase: A High-Quality Bangla Paraphrase Dataset](https://arxiv.org/abs/2210.05109)
- **Point of Contact:** [Najrin Sultana](mailto:nazrinshukti@gmail.com)
### Dataset Summary
We present BanglaParaphrase, a high-quality synthetic Bangla paraphrase dataset containing about 466k paraphrase pairs.
The paraphrases are of high quality: semantically coherent and syntactically diverse.
### Supported Tasks and Leaderboards
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
### Languages
- `bengali`
## Loading the dataset
```python
from datasets import load_dataset

ds = load_dataset("csebuetnlp/BanglaParaphrase")
```
## Dataset Structure
### Data Instances
One example from the `train` part of the dataset is given below in JSON format.
```
{
"source": "বেশিরভাগ সময় প্রকৃতির দয়ার ওপরেই বেঁচে থাকতেন উপজাতিরা।",
"target": "বেশিরভাগ সময়ই উপজাতিরা প্রকৃতির দয়ার উপর নির্ভরশীল ছিল।"
}
```
### Data Fields
- 'source': A string representing the source sentence.
- 'target': A string representing the target sentence.
### Data Splits
The train-validation-test example counts are given below:
Language | ISO 639-1 Code | Train | Validation | Test |
-------------- | ---------------- | ------- | ----- | ------ |
Bengali | bn | 419,967 | 23,331 | 23,332 |
## Dataset Creation
### Curation Rationale
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
### Source Data
[Roar Bangla](https://roar.media/bangla)
#### Initial Data Collection and Normalization
[Detailed in the paper](https://arxiv.org/abs/2210.05109)
#### Who are the source language producers?
[Detailed in the paper](https://arxiv.org/abs/2210.05109)
### Annotations
[Detailed in the paper](https://arxiv.org/abs/2210.05109)
#### Annotation process
[Detailed in the paper](https://arxiv.org/abs/2210.05109)
#### Who are the annotators?
[Detailed in the paper](https://arxiv.org/abs/2210.05109)
### Personal and Sensitive Information
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
## Considerations for Using the Data
### Social Impact of Dataset
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
### Discussion of Biases
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
### Other Known Limitations
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
## Additional Information
### Dataset Curators
[More information needed](https://github.com/csebuetnlp/banglaparaphrase)
### Licensing Information
Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/). Copyright of the dataset contents belongs to the original copyright holders.
### Citation Information
```
@article{akil2022banglaparaphrase,
title={BanglaParaphrase: A High-Quality Bangla Paraphrase Dataset},
author={Akil, Ajwad and Sultana, Najrin and Bhattacharjee, Abhik and Shahriyar, Rifat},
journal={arXiv preprint arXiv:2210.05109},
year={2022}
}
```
### Contributions
|
p1atdev/FractalDB-1k | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '00000'
'1': '00001'
'2': '00002'
'3': '00003'
'4': '00004'
'5': '00005'
'6': '00006'
'7': '00007'
'8': 8
'9': 9
'10': '00010'
'11': '00011'
'12': '00012'
'13': '00013'
'14': '00014'
'15': '00015'
'16': '00016'
'17': '00017'
'18': 18
'19': 19
'20': '00020'
'21': '00021'
'22': '00022'
'23': '00023'
'24': '00024'
'25': '00025'
'26': '00026'
'27': '00027'
'28': 28
'29': 29
'30': '00030'
'31': '00031'
'32': '00032'
'33': '00033'
'34': '00034'
'35': '00035'
'36': '00036'
'37': '00037'
'38': 38
'39': 39
'40': '00040'
'41': '00041'
'42': '00042'
'43': '00043'
'44': '00044'
'45': '00045'
'46': '00046'
'47': '00047'
'48': 48
'49': 49
'50': '00050'
'51': '00051'
'52': '00052'
'53': '00053'
'54': '00054'
'55': '00055'
'56': '00056'
'57': '00057'
'58': 58
'59': 59
'60': '00060'
'61': '00061'
'62': '00062'
'63': '00063'
'64': '00064'
'65': '00065'
'66': '00066'
'67': '00067'
'68': 68
'69': 69
'70': '00070'
'71': '00071'
'72': '00072'
'73': '00073'
'74': '00074'
'75': '00075'
'76': '00076'
'77': '00077'
'78': 78
'79': 79
'80': 80
'81': 81
'82': 82
'83': 83
'84': 84
'85': 85
'86': 86
'87': 87
'88': 88
'89': 89
'90': 90
'91': 91
'92': 92
'93': 93
'94': 94
'95': 95
'96': 96
'97': 97
'98': 98
'99': 99
'100': '00100'
'101': '00101'
'102': '00102'
'103': '00103'
'104': '00104'
'105': '00105'
'106': '00106'
'107': '00107'
'108': 108
'109': 109
'110': '00110'
'111': '00111'
'112': '00112'
'113': '00113'
'114': '00114'
'115': '00115'
'116': '00116'
'117': '00117'
'118': 118
'119': 119
'120': '00120'
'121': '00121'
'122': '00122'
'123': '00123'
'124': '00124'
'125': '00125'
'126': '00126'
'127': '00127'
'128': 128
'129': 129
'130': '00130'
'131': '00131'
'132': '00132'
'133': '00133'
'134': '00134'
'135': '00135'
'136': '00136'
'137': '00137'
'138': 138
'139': 139
'140': '00140'
'141': '00141'
'142': '00142'
'143': '00143'
'144': '00144'
'145': '00145'
'146': '00146'
'147': '00147'
'148': 148
'149': 149
'150': '00150'
'151': '00151'
'152': '00152'
'153': '00153'
'154': '00154'
'155': '00155'
'156': '00156'
'157': '00157'
'158': 158
'159': 159
'160': '00160'
'161': '00161'
'162': '00162'
'163': '00163'
'164': '00164'
'165': '00165'
'166': '00166'
'167': '00167'
'168': 168
'169': 169
'170': '00170'
'171': '00171'
'172': '00172'
'173': '00173'
'174': '00174'
'175': '00175'
'176': '00176'
'177': '00177'
'178': 178
'179': 179
'180': 180
'181': 181
'182': 182
'183': 183
'184': 184
'185': 185
'186': 186
'187': 187
'188': 188
'189': 189
'190': 190
'191': 191
'192': 192
'193': 193
'194': 194
'195': 195
'196': 196
'197': 197
'198': 198
'199': 199
'200': '00200'
'201': '00201'
'202': '00202'
'203': '00203'
'204': '00204'
'205': '00205'
'206': '00206'
'207': '00207'
'208': 208
'209': 209
'210': '00210'
'211': '00211'
'212': '00212'
'213': '00213'
'214': '00214'
'215': '00215'
'216': '00216'
'217': '00217'
'218': 218
'219': 219
'220': '00220'
'221': '00221'
'222': '00222'
'223': '00223'
'224': '00224'
'225': '00225'
'226': '00226'
'227': '00227'
'228': 228
'229': 229
'230': '00230'
'231': '00231'
'232': '00232'
'233': '00233'
'234': '00234'
'235': '00235'
'236': '00236'
'237': '00237'
'238': 238
'239': 239
'240': '00240'
'241': '00241'
'242': '00242'
'243': '00243'
'244': '00244'
'245': '00245'
'246': '00246'
'247': '00247'
'248': 248
'249': 249
'250': '00250'
'251': '00251'
'252': '00252'
'253': '00253'
'254': '00254'
'255': '00255'
'256': '00256'
'257': '00257'
'258': 258
'259': 259
'260': '00260'
'261': '00261'
'262': '00262'
'263': '00263'
'264': '00264'
'265': '00265'
'266': '00266'
'267': '00267'
'268': 268
'269': 269
'270': '00270'
'271': '00271'
'272': '00272'
'273': '00273'
'274': '00274'
'275': '00275'
'276': '00276'
'277': '00277'
'278': 278
'279': 279
'280': 280
'281': 281
'282': 282
'283': 283
'284': 284
'285': 285
'286': 286
'287': 287
'288': 288
'289': 289
'290': 290
'291': 291
'292': 292
'293': 293
'294': 294
'295': 295
'296': 296
'297': 297
'298': 298
'299': 299
'300': '00300'
'301': '00301'
'302': '00302'
'303': '00303'
'304': '00304'
'305': '00305'
'306': '00306'
'307': '00307'
'308': 308
'309': 309
'310': '00310'
'311': '00311'
'312': '00312'
'313': '00313'
'314': '00314'
'315': '00315'
'316': '00316'
'317': '00317'
'318': 318
'319': 319
'320': '00320'
'321': '00321'
'322': '00322'
'323': '00323'
'324': '00324'
'325': '00325'
'326': '00326'
'327': '00327'
'328': 328
'329': 329
'330': '00330'
'331': '00331'
'332': '00332'
'333': '00333'
'334': '00334'
'335': '00335'
'336': '00336'
'337': '00337'
'338': 338
'339': 339
'340': '00340'
'341': '00341'
'342': '00342'
'343': '00343'
'344': '00344'
'345': '00345'
'346': '00346'
'347': '00347'
'348': 348
'349': 349
'350': '00350'
'351': '00351'
'352': '00352'
'353': '00353'
'354': '00354'
'355': '00355'
'356': '00356'
'357': '00357'
'358': 358
'359': 359
'360': '00360'
'361': '00361'
'362': '00362'
'363': '00363'
'364': '00364'
'365': '00365'
'366': '00366'
'367': '00367'
'368': 368
'369': 369
'370': '00370'
'371': '00371'
'372': '00372'
'373': '00373'
'374': '00374'
'375': '00375'
'376': '00376'
'377': '00377'
'378': 378
'379': 379
'380': 380
'381': 381
'382': 382
'383': 383
'384': 384
'385': 385
'386': 386
'387': 387
'388': 388
'389': 389
'390': 390
'391': 391
'392': 392
'393': 393
'394': 394
'395': 395
'396': 396
'397': 397
'398': 398
'399': 399
'400': '00400'
'401': '00401'
'402': '00402'
'403': '00403'
'404': '00404'
'405': '00405'
'406': '00406'
'407': '00407'
'408': 408
'409': 409
'410': '00410'
'411': '00411'
'412': '00412'
'413': '00413'
'414': '00414'
'415': '00415'
'416': '00416'
'417': '00417'
'418': 418
'419': 419
'420': '00420'
'421': '00421'
'422': '00422'
'423': '00423'
'424': '00424'
'425': '00425'
'426': '00426'
'427': '00427'
'428': 428
'429': 429
'430': '00430'
'431': '00431'
'432': '00432'
'433': '00433'
'434': '00434'
'435': '00435'
'436': '00436'
'437': '00437'
'438': 438
'439': 439
'440': '00440'
'441': '00441'
'442': '00442'
'443': '00443'
'444': '00444'
'445': '00445'
'446': '00446'
'447': '00447'
'448': 448
'449': 449
'450': '00450'
'451': '00451'
'452': '00452'
'453': '00453'
'454': '00454'
'455': '00455'
'456': '00456'
'457': '00457'
'458': 458
'459': 459
'460': '00460'
'461': '00461'
'462': '00462'
'463': '00463'
'464': '00464'
'465': '00465'
'466': '00466'
'467': '00467'
'468': 468
'469': 469
'470': '00470'
'471': '00471'
'472': '00472'
'473': '00473'
'474': '00474'
'475': '00475'
'476': '00476'
'477': '00477'
'478': 478
'479': 479
'480': 480
'481': 481
'482': 482
'483': 483
'484': 484
'485': 485
'486': 486
'487': 487
'488': 488
'489': 489
'490': 490
'491': 491
'492': 492
'493': 493
'494': 494
'495': 495
'496': 496
'497': 497
'498': 498
'499': 499
'500': '00500'
'501': '00501'
'502': '00502'
'503': '00503'
'504': '00504'
'505': '00505'
'506': '00506'
'507': '00507'
'508': 508
'509': 509
'510': '00510'
'511': '00511'
'512': '00512'
'513': '00513'
'514': '00514'
'515': '00515'
'516': '00516'
'517': '00517'
'518': 518
'519': 519
'520': '00520'
'521': '00521'
'522': '00522'
'523': '00523'
'524': '00524'
'525': '00525'
'526': '00526'
'527': '00527'
'528': 528
'529': 529
'530': '00530'
'531': '00531'
'532': '00532'
'533': '00533'
'534': '00534'
'535': '00535'
'536': '00536'
'537': '00537'
'538': 538
'539': 539
'540': '00540'
'541': '00541'
'542': '00542'
'543': '00543'
'544': '00544'
'545': '00545'
'546': '00546'
'547': '00547'
'548': 548
'549': 549
'550': '00550'
'551': '00551'
'552': '00552'
'553': '00553'
'554': '00554'
'555': '00555'
'556': '00556'
'557': '00557'
'558': 558
'559': 559
'560': '00560'
'561': '00561'
'562': '00562'
'563': '00563'
'564': '00564'
'565': '00565'
'566': '00566'
'567': '00567'
'568': 568
'569': 569
'570': '00570'
'571': '00571'
'572': '00572'
'573': '00573'
'574': '00574'
'575': '00575'
'576': '00576'
'577': '00577'
'578': 578
'579': 579
'580': 580
'581': 581
'582': 582
'583': 583
'584': 584
'585': 585
'586': 586
'587': 587
'588': 588
'589': 589
'590': 590
'591': 591
'592': 592
'593': 593
'594': 594
'595': 595
'596': 596
'597': 597
'598': 598
'599': 599
'600': '00600'
'601': '00601'
'602': '00602'
'603': '00603'
'604': '00604'
'605': '00605'
'606': '00606'
'607': '00607'
'608': 608
'609': 609
'610': '00610'
'611': '00611'
'612': '00612'
'613': '00613'
'614': '00614'
'615': '00615'
'616': '00616'
'617': '00617'
'618': 618
'619': 619
'620': '00620'
'621': '00621'
'622': '00622'
'623': '00623'
'624': '00624'
'625': '00625'
'626': '00626'
'627': '00627'
'628': 628
'629': 629
'630': '00630'
'631': '00631'
'632': '00632'
'633': '00633'
'634': '00634'
'635': '00635'
'636': '00636'
'637': '00637'
'638': 638
'639': 639
'640': '00640'
'641': '00641'
'642': '00642'
'643': '00643'
'644': '00644'
'645': '00645'
'646': '00646'
'647': '00647'
'648': 648
'649': 649
'650': '00650'
'651': '00651'
'652': '00652'
'653': '00653'
'654': '00654'
'655': '00655'
'656': '00656'
'657': '00657'
'658': 658
'659': 659
'660': '00660'
'661': '00661'
'662': '00662'
'663': '00663'
'664': '00664'
'665': '00665'
'666': '00666'
'667': '00667'
'668': 668
'669': 669
'670': '00670'
'671': '00671'
'672': '00672'
'673': '00673'
'674': '00674'
'675': '00675'
'676': '00676'
'677': '00677'
'678': 678
'679': 679
'680': 680
'681': 681
'682': 682
'683': 683
'684': 684
'685': 685
'686': 686
'687': 687
'688': 688
'689': 689
'690': 690
'691': 691
'692': 692
'693': 693
'694': 694
'695': 695
'696': 696
'697': 697
'698': 698
'699': 699
'700': '00700'
'701': '00701'
'702': '00702'
'703': '00703'
'704': '00704'
'705': '00705'
'706': '00706'
'707': '00707'
'708': 708
'709': 709
'710': '00710'
'711': '00711'
'712': '00712'
'713': '00713'
'714': '00714'
'715': '00715'
'716': '00716'
'717': '00717'
'718': 718
'719': 719
'720': '00720'
'721': '00721'
'722': '00722'
'723': '00723'
'724': '00724'
'725': '00725'
'726': '00726'
'727': '00727'
'728': 728
'729': 729
'730': '00730'
'731': '00731'
'732': '00732'
'733': '00733'
'734': '00734'
'735': '00735'
'736': '00736'
'737': '00737'
'738': 738
'739': 739
'740': '00740'
'741': '00741'
'742': '00742'
'743': '00743'
'744': '00744'
'745': '00745'
'746': '00746'
'747': '00747'
'748': 748
'749': 749
'750': '00750'
'751': '00751'
'752': '00752'
'753': '00753'
'754': '00754'
'755': '00755'
'756': '00756'
'757': '00757'
'758': 758
'759': 759
'760': '00760'
'761': '00761'
'762': '00762'
'763': '00763'
'764': '00764'
'765': '00765'
'766': '00766'
'767': '00767'
'768': 768
'769': 769
'770': '00770'
'771': '00771'
'772': '00772'
'773': '00773'
'774': '00774'
'775': '00775'
'776': '00776'
'777': '00777'
'778': 778
'779': 779
'780': 780
'781': 781
'782': 782
'783': 783
'784': 784
'785': 785
'786': 786
'787': 787
'788': 788
'789': 789
'790': 790
'791': 791
'792': 792
'793': 793
'794': 794
'795': 795
'796': 796
'797': 797
'798': 798
'799': 799
'800': 800
'801': 801
'802': 802
'803': 803
'804': 804
'805': 805
'806': 806
'807': 807
'808': 808
'809': 809
'810': 810
'811': 811
'812': 812
'813': 813
'814': 814
'815': 815
'816': 816
'817': 817
'818': 818
'819': 819
'820': 820
'821': 821
'822': 822
'823': 823
'824': 824
'825': 825
'826': 826
'827': 827
'828': 828
'829': 829
'830': 830
'831': 831
'832': 832
'833': 833
'834': 834
'835': 835
'836': 836
'837': 837
'838': 838
'839': 839
'840': 840
'841': 841
'842': 842
'843': 843
'844': 844
'845': 845
'846': 846
'847': 847
'848': 848
'849': 849
'850': 850
'851': 851
'852': 852
'853': 853
'854': 854
'855': 855
'856': 856
'857': 857
'858': 858
'859': 859
'860': 860
'861': 861
'862': 862
'863': 863
'864': 864
'865': 865
'866': 866
'867': 867
'868': 868
'869': 869
'870': 870
'871': 871
'872': 872
'873': 873
'874': 874
'875': 875
'876': 876
'877': 877
'878': 878
'879': 879
'880': 880
'881': 881
'882': 882
'883': 883
'884': 884
'885': 885
'886': 886
'887': 887
'888': 888
'889': 889
'890': 890
'891': 891
'892': 892
'893': 893
'894': 894
'895': 895
'896': 896
'897': 897
'898': 898
'899': 899
'900': 900
'901': 901
'902': 902
'903': 903
'904': 904
'905': 905
'906': 906
'907': 907
'908': 908
'909': 909
'910': 910
'911': 911
'912': 912
'913': 913
'914': 914
'915': 915
'916': 916
'917': 917
'918': 918
'919': 919
'920': 920
'921': 921
'922': 922
'923': 923
'924': 924
'925': 925
'926': 926
'927': 927
'928': 928
'929': 929
'930': 930
'931': 931
'932': 932
'933': 933
'934': 934
'935': 935
'936': 936
'937': 937
'938': 938
'939': 939
'940': 940
'941': 941
'942': 942
'943': 943
'944': 944
'945': 945
'946': 946
'947': 947
'948': 948
'949': 949
'950': 950
'951': 951
'952': 952
'953': 953
'954': 954
'955': 955
'956': 956
'957': 957
'958': 958
'959': 959
'960': 960
'961': 961
'962': 962
'963': 963
'964': 964
'965': 965
'966': 966
'967': 967
'968': 968
'969': 969
'970': 970
'971': 971
'972': 972
'973': 973
'974': 974
'975': 975
'976': 976
'977': 977
'978': 978
'979': 979
'980': 980
'981': 981
'982': 982
'983': 983
'984': 984
'985': 985
'986': 986
'987': 987
'988': 988
'989': 989
'990': 990
'991': 991
'992': 992
'993': 993
'994': 994
'995': 995
'996': 996
'997': 997
'998': 998
'999': 999
splits:
- name: train
num_bytes: 13905951000
num_examples: 1000000
download_size: 14091259942
dataset_size: 13905951000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-4.0
task_categories:
- image-classification
pretty_name: FractalDB 1k
size_categories:
- 1M<n<10M
---
# FractalDB 1k
FractalDB 1k dataset from [Pre-training without Natural Images](https://hirokatsukataoka16.github.io/Pretraining-without-Natural-Images/).
[Original repo](https://github.com/hirokatsukataoka16/FractalDB-Pretrained-ResNet-PyTorch) | [Project page](https://hirokatsukataoka16.github.io/Pretraining-without-Natural-Images/) | [arXiv](https://arxiv.org/abs/2101.08515)
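## Usage
The class-label mapping in the metadata above pairs each integer label with a fractal category name; for most entries this is the label zero-padded to five digits (a few entries appear unpadded, likely a YAML serialization quirk). A minimal sketch of that convention, assuming the five-digit zero-padded form is canonical:

```python
def category_name(label: int) -> str:
    """Map an integer class label to its FractalDB category name,
    assuming the five-digit zero-padded convention seen in the
    class-label mapping above (e.g. 33 -> '00033')."""
    if not 0 <= label <= 999:
        raise ValueError(f"FractalDB-1k has 1000 classes, got label {label}")
    return f"{label:05d}"
```

For example, `category_name(33)` gives `"00033"`, matching the mapping entry for label 33.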
## Citing
```bibtex
@article{KataokaIJCV2022,
author={Kataoka, Hirokatsu and Okayasu, Kazushige and Matsumoto, Asato and Yamagata, Eisuke and Yamada, Ryosuke and Inoue, Nakamasa and Nakamura, Akio and Satoh, Yutaka},
title={Pre-training without Natural Images},
journal={International Journal of Computer Vision (IJCV)},
year={2022},
}
@inproceedings{KataokaACCV2020,
author={Kataoka, Hirokatsu and Okayasu, Kazushige and Matsumoto, Asato and Yamagata, Eisuke and Yamada, Ryosuke and Inoue, Nakamasa and Nakamura, Akio and Satoh, Yutaka},
title={Pre-training without Natural Images},
booktitle={Asian Conference on Computer Vision (ACCV)},
year={2020},
}
@misc{kataoka2021pretraining,
title={Pre-training without Natural Images},
author={Hirokatsu Kataoka and Kazushige Okayasu and Asato Matsumoto and Eisuke Yamagata and Ryosuke Yamada and Nakamasa Inoue and Akio Nakamura and Yutaka Satoh},
year={2021},
eprint={2101.08515},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
Deojoandco/ah_openai_dialog_annotation_v1 | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: 'null'
- name: over_18
dtype: bool
- name: created_utc
dtype: int64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: string
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
- name: query
dtype: string
- name: dialog
dtype: string
- name: dialog_success
dtype: bool
- name: annotation_error
dtype: bool
- name: annotation
struct:
- name: success
dtype: bool
- name: text
dtype: string
splits:
- name: train
num_bytes: 2948432
num_examples: 297
download_size: 1766339
dataset_size: 2948432
---
# Dataset Card for "ah_openai_dialog_annotation_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_15 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1024449840
num_examples: 199620
download_size: 1045910583
dataset_size: 1024449840
---
# Dataset Card for "chunk_15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-medical_questions_pairs-default-d0c070-68078145610 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- medical_questions_pairs
eval_info:
task: summarization
model: ARTeLab/it5-summarization-ilpost
metrics: []
dataset_name: medical_questions_pairs
dataset_config: default
dataset_split: train
col_mapping:
text: question_1
target: question_2
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-ilpost
* Dataset: medical_questions_pairs
* Config: default
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@halmj](https://huggingface.co/halmj) for evaluating this model. |
sade-adrien/redpajama_v2_32k | ---
dataset_info:
features:
- name: raw_content
dtype: string
- name: doc_id
dtype: string
- name: meta
dtype: string
- name: quality_signals
dtype: string
splits:
- name: train
num_bytes: 3919516915.1525173
num_examples: 364920
download_size: 18972252318
dataset_size: 3919516915.1525173
---
# Dataset Card for "redpajama_v2_32k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sophiex/hh-rlhf | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 205855797
num_examples: 160800
- name: test
num_bytes: 11020390
num_examples: 8552
download_size: 128272482
dataset_size: 216876187
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
manishiitg/CogStack-QA | ---
dataset_info:
features:
- name: org_text
dtype: string
- name: raw_id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 18045985
num_examples: 24665
download_size: 7759274
dataset_size: 18045985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
eswardivi/Aksharantar | ---
dataset_info:
- config_name: asm
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 7246553
num_examples: 178630
- name: valid
num_bytes: 155473
num_examples: 3788
- name: test
num_bytes: 215853
num_examples: 5506
download_size: 4806305
dataset_size: 7617879
- config_name: ben
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 53625021
num_examples: 1231428
- name: valid
num_bytes: 425704
num_examples: 11276
- name: test
num_bytes: 536999
num_examples: 14167
download_size: 33797771
dataset_size: 54587724
- config_name: brx
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 1549176
num_examples: 35618
- name: valid
num_bytes: 127620
num_examples: 3068
- name: test
num_bytes: 158976
num_examples: 4081
download_size: 1041579
dataset_size: 1835772
- config_name: doi
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 50960
num_examples: 1584
- name: test
num_bytes: 62772
num_examples: 2000
download_size: 75793
dataset_size: 113732
- config_name: guj
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 48426490
num_examples: 1143212
- name: valid
num_bytes: 457631
num_examples: 12419
- name: test
num_bytes: 690823
num_examples: 18077
download_size: 31145762
dataset_size: 49574944
- config_name: hin
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 52038534
num_examples: 1299155
- name: valid
num_bytes: 223121
num_examples: 6357
- name: test
num_bytes: 368927
num_examples: 10112
download_size: 34053230
dataset_size: 52630582
- config_name: kan
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 158229246
num_examples: 2906728
- name: valid
num_bytes: 318367
num_examples: 7025
- name: test
num_bytes: 534114
num_examples: 11380
download_size: 91749260
dataset_size: 159081727
- config_name: kas
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 1310641
num_examples: 46635
- name: valid
num_bytes: 117768
num_examples: 4456
- name: test
num_bytes: 175480
num_examples: 6908
download_size: 1175597
dataset_size: 1603889
- config_name: kok
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 29164783
num_examples: 612525
- name: valid
num_bytes: 154507
num_examples: 3502
- name: test
num_bytes: 194477
num_examples: 5042
download_size: 17786669
dataset_size: 29513767
- config_name: mai
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 11088031
num_examples: 282639
- name: valid
num_bytes: 145082
num_examples: 3790
- name: test
num_bytes: 195832
num_examples: 5449
download_size: 7353930
dataset_size: 11428945
- config_name: mal
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 255792875
num_examples: 4100621
- name: valid
num_bytes: 364734
num_examples: 7613
- name: test
num_bytes: 613721
num_examples: 12451
download_size: 141329273
dataset_size: 256771330
- config_name: mar
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 70379039
num_examples: 1452748
- name: valid
num_bytes: 306473
num_examples: 7646
- name: test
num_bytes: 501632
num_examples: 12190
download_size: 42714793
dataset_size: 71187144
- config_name: mni
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 359476
num_examples: 10060
- name: valid
num_bytes: 112250
num_examples: 3260
- name: test
num_bytes: 166708
num_examples: 4889
download_size: 384776
dataset_size: 638434
- config_name: nep
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 115703649
num_examples: 2397414
- name: valid
num_bytes: 128685
num_examples: 2804
- name: test
num_bytes: 161326
num_examples: 4101
download_size: 70685486
dataset_size: 115993660
- config_name: ori
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 15223026
num_examples: 346492
- name: valid
num_bytes: 133701
num_examples: 3093
- name: test
num_bytes: 168260
num_examples: 4228
download_size: 9415265
dataset_size: 15524987
- config_name: pan
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 18625789
num_examples: 514724
- name: valid
num_bytes: 280876
num_examples: 8880
- name: test
num_bytes: 363793
num_examples: 11237
download_size: 12634738
dataset_size: 19270458
- config_name: san
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 103031038
num_examples: 1813369
- name: valid
num_bytes: 175843
num_examples: 3398
- name: test
num_bytes: 218125
num_examples: 5302
download_size: 61369090
dataset_size: 103425006
- config_name: sid
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 1590769
num_examples: 59715
- name: valid
num_bytes: 207035
num_examples: 8375
- name: test
num_bytes: 153505
num_examples: 6407
download_size: 1471769
dataset_size: 1951309
- config_name: tam
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 189446572
num_examples: 3230902
- name: valid
num_bytes: 405125
num_examples: 8824
- name: test
num_bytes: 512678
num_examples: 11499
download_size: 103185235
dataset_size: 190364375
- config_name: tel
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 125668188
num_examples: 2429562
- name: valid
num_bytes: 327494
num_examples: 7681
- name: test
num_bytes: 433170
num_examples: 10260
download_size: 75120677
dataset_size: 126428852
- config_name: urd
features:
- name: native word
dtype: string
- name: english word
dtype: string
splits:
- name: train
num_bytes: 21546318
num_examples: 699024
- name: valid
num_bytes: 317819
num_examples: 12419
- name: test
num_bytes: 384213
num_examples: 14878
download_size: 16824949
dataset_size: 22248350
configs:
- config_name: asm
data_files:
- split: train
path: asm/train-*
- split: valid
path: asm/valid-*
- split: test
path: asm/test-*
- config_name: ben
data_files:
- split: train
path: ben/train-*
- split: valid
path: ben/valid-*
- split: test
path: ben/test-*
- config_name: brx
data_files:
- split: train
path: brx/train-*
- split: valid
path: brx/valid-*
- split: test
path: brx/test-*
- config_name: doi
data_files:
- split: train
path: doi/train-*
- split: test
path: doi/test-*
- config_name: guj
data_files:
- split: train
path: guj/train-*
- split: valid
path: guj/valid-*
- split: test
path: guj/test-*
- config_name: hin
data_files:
- split: train
path: hin/train-*
- split: valid
path: hin/valid-*
- split: test
path: hin/test-*
- config_name: kan
data_files:
- split: train
path: kan/train-*
- split: valid
path: kan/valid-*
- split: test
path: kan/test-*
- config_name: kas
data_files:
- split: train
path: kas/train-*
- split: valid
path: kas/valid-*
- split: test
path: kas/test-*
- config_name: kok
data_files:
- split: train
path: kok/train-*
- split: valid
path: kok/valid-*
- split: test
path: kok/test-*
- config_name: mai
data_files:
- split: train
path: mai/train-*
- split: valid
path: mai/valid-*
- split: test
path: mai/test-*
- config_name: mal
data_files:
- split: train
path: mal/train-*
- split: valid
path: mal/valid-*
- split: test
path: mal/test-*
- config_name: mar
data_files:
- split: train
path: mar/train-*
- split: valid
path: mar/valid-*
- split: test
path: mar/test-*
- config_name: mni
data_files:
- split: train
path: mni/train-*
- split: valid
path: mni/valid-*
- split: test
path: mni/test-*
- config_name: nep
data_files:
- split: train
path: nep/train-*
- split: valid
path: nep/valid-*
- split: test
path: nep/test-*
- config_name: ori
data_files:
- split: train
path: ori/train-*
- split: valid
path: ori/valid-*
- split: test
path: ori/test-*
- config_name: pan
data_files:
- split: train
path: pan/train-*
- split: valid
path: pan/valid-*
- split: test
path: pan/test-*
- config_name: san
data_files:
- split: train
path: san/train-*
- split: valid
path: san/valid-*
- split: test
path: san/test-*
- config_name: sid
data_files:
- split: train
path: sid/train-*
- split: valid
path: sid/valid-*
- split: test
path: sid/test-*
- config_name: tam
data_files:
- split: train
path: tam/train-*
- split: valid
path: tam/valid-*
- split: test
path: tam/test-*
- config_name: tel
data_files:
- split: train
path: tel/train-*
- split: valid
path: tel/valid-*
- split: test
path: tel/test-*
- config_name: urd
data_files:
- split: train
path: urd/train-*
- split: valid
path: urd/valid-*
- split: test
path: urd/test-*
language:
- asm
- ben
- brx
- doi
- guj
- hin
- kan
- kas
- kok
- mai
- mal
- mar
- mni
- nep
- ori
- pan
- san
- sid
- tam
- tel
- urd
license: cc
multilinguality:
- multilingual
pretty_name: Aksharantar
size_categories:
- 10M<n<100M
---
# Aksharantar
This dataset is derived from [Aksharantar](https://huggingface.co/datasets/ai4bharat/Aksharantar).
# Languages
| <!-- -->       | <!-- -->       | <!-- -->        | <!-- -->      | <!-- -->       | <!-- -->     |
| -------------- | -------------- | --------------- | ------------- | -------------- | ------------ |
| Assamese (asm) | Hindi (hin)    | Maithili (mai)  | Marathi (mar) | Punjabi (pan)  | Tamil (tam)  |
| Bengali (ben)  | Kannada (kan)  | Malayalam (mal) | Nepali (nep)  | Sanskrit (san) | Telugu (tel) |
| Bodo (brx)     | Kashmiri (kas) | Manipuri (mni)  | Oriya (ori)   | Sindhi (sid)   | Urdu (urd)   |
| Gujarati (guj) | Konkani (kok)  | Dogri (doi)     |               |                |              |
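# Usage
Each ISO code above is also a dataset config name (see the `configs` list in the metadata). A minimal loading sketch; the `datasets` import is deferred so the config check itself needs no network access:

```python
# The 21 config names, matching the ISO codes in the table above.
CONFIGS = ["asm", "ben", "brx", "doi", "guj", "hin", "kan", "kas", "kok",
           "mai", "mal", "mar", "mni", "nep", "ori", "pan", "san", "sid",
           "tam", "tel", "urd"]

def load_pairs(config: str, split: str = "train"):
    """Load (native word, english word) pairs for one language config."""
    if config not in CONFIGS:
        raise ValueError(f"unknown config {config!r}; choose one of {CONFIGS}")
    from datasets import load_dataset  # deferred: only needed at load time
    return load_dataset("eswardivi/Aksharantar", config, split=split)
```

Note that per the split metadata above, `doi` ships only `train` and `test` splits; all other configs also have `valid`.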
|
open-llm-leaderboard/details_Weyaxi__Einstein-v5-v0.2-7B | ---
pretty_name: Evaluation run of Weyaxi/Einstein-v5-v0.2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Einstein-v5-v0.2-7B](https://huggingface.co/Weyaxi/Einstein-v5-v0.2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-v5-v0.2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T21:09:37.228677](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v5-v0.2-7B/blob/main/results_2024-03-27T21-09-37.228677.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.612286564752706,\n\
\ \"acc_stderr\": 0.032839983165383065,\n \"acc_norm\": 0.6135779860343825,\n\
\ \"acc_norm_stderr\": 0.03350956178751591,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5259333753586267,\n\
\ \"mc2_stderr\": 0.015070357329952046\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642463,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.01425856388051378\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6148177653853814,\n\
\ \"acc_stderr\": 0.004856437955719861,\n \"acc_norm\": 0.8098984266082454,\n\
\ \"acc_norm_stderr\": 0.003915792315457802\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798328,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798328\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319619,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319619\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467383,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467383\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.03031371053819889,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.03031371053819889\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306422,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016015,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016015\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676187,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676187\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.014485656041669178,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.014485656041669178\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475358,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475358\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\
\ \"acc_stderr\": 0.012733671880342507,\n \"acc_norm\": 0.4621903520208605,\n\
\ \"acc_norm_stderr\": 0.012733671880342507\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825362,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825362\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328913,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5259333753586267,\n\
\ \"mc2_stderr\": 0.015070357329952046\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722769\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5966641394996209,\n \
\ \"acc_stderr\": 0.01351265478181471\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Einstein-v5-v0.2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|arc:challenge|25_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|gsm8k|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hellaswag|10_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-09-37.228677.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T21-09-37.228677.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- '**/details_harness|winogrande|5_2024-03-27T21-09-37.228677.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T21-09-37.228677.parquet'
- config_name: results
data_files:
- split: 2024_03_27T21_09_37.228677
path:
- results_2024-03-27T21-09-37.228677.parquet
- split: latest
path:
- results_2024-03-27T21-09-37.228677.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Einstein-v5-v0.2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-v5-v0.2-7B](https://huggingface.co/Weyaxi/Einstein-v5-v0.2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-v5-v0.2-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-27T21:09:37.228677](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v5-v0.2-7B/blob/main/results_2024-03-27T21-09-37.228677.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.612286564752706,
"acc_stderr": 0.032839983165383065,
"acc_norm": 0.6135779860343825,
"acc_norm_stderr": 0.03350956178751591,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5259333753586267,
"mc2_stderr": 0.015070357329952046
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642463,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.01425856388051378
},
"harness|hellaswag|10": {
"acc": 0.6148177653853814,
"acc_stderr": 0.004856437955719861,
"acc_norm": 0.8098984266082454,
"acc_norm_stderr": 0.003915792315457802
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798328,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798328
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319619,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319619
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467383,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306422,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306422
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016015,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676187,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676187
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.014485656041669178,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.014485656041669178
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475358,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475358
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342507,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342507
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825362,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825362
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328913,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5259333753586267,
"mc2_stderr": 0.015070357329952046
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722769
},
"harness|gsm8k|5": {
"acc": 0.5966641394996209,
"acc_stderr": 0.01351265478181471
}
}
```
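As a sketch of how a results dict shaped like the JSON above can be post-processed (the helper below is illustrative, not part of the leaderboard tooling), one can macro-average the per-task `acc` values over the MMLU (`hendrycksTest`) entries:

```python
# Hypothetical sketch: macro-average MMLU ("hendrycksTest") accuracy from a
# results dict shaped like the JSON above. The entries here are a small excerpt.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5481481481481482},
    "harness|winogrande|5": {"acc": 0.7868981846882399},  # not an MMLU task
}

def mmlu_macro_avg(results: dict) -> float:
    """Average `acc` over all hendrycksTest (MMLU) entries, ignoring other tasks."""
    accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
    return sum(accs) / len(accs)

print(round(mmlu_macro_avg(results), 4))  # averages only the two MMLU entries
```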
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
5roop/juzne_vesti | ---
language:
- sr
license: cc-by-sa-4.0
size_categories:
- 10K<n<100K
pretty_name: Južne Vesti
dataset_info:
features:
- name: audio
dtype: audio
- name: split
dtype: string
- name: transcript
dtype: string
- name: norm_transcript
dtype: string
- name: guest_name
dtype: string
- name: host
dtype: string
- name: guest_description
dtype: string
- name: speaker_breakdown
dtype: string
splits:
- name: train
num_bytes: 4687838374.879606
num_examples: 8648
- name: test
num_bytes: 584596072.5389507
num_examples: 1081
- name: dev
num_bytes: 583281117.6094437
num_examples: 1082
download_size: 5813877393
dataset_size: 5855715565.028001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
handle:
- http://hdl.handle.net/11356/1679
---
# ASR training dataset for Serbian JuzneVesti-SR v1.0
hdl: http://hdl.handle.net/11356/1679
The JuzneVesti-SR dataset consists of audio recordings and manual transcripts from the Južne Vesti website and its talk show '15 minuta' (https://www.juznevesti.com/Tagovi/Intervju-15-minuta.sr.html).
The processing of the audio and its alignment to the manual transcripts followed the pipeline of the ParlaSpeech-HR dataset (http://hdl.handle.net/11356/1494) as closely as possible.
Segments in this dataset range from 2 to 30 seconds.
The train-dev-test split was performed with an 80:10:10 ratio.
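The split ratio can be sketched as follows (a minimal illustration; the actual split procedure used for this dataset may differ, e.g. it may group segments by episode):

```python
# Minimal sketch of an 80:10:10 train/dev/test partition over segment ids.
# Assumption: a plain shuffled split; the dataset's real procedure may differ.
import random

def split_80_10_10(items, seed=0):
    items = list(items)
    random.Random(seed).shuffle(items)  # deterministic shuffle for a fixed seed
    n = len(items)
    n_train = int(n * 0.8)
    n_dev = int(n * 0.1)
    return {
        "train": items[:n_train],
        "dev": items[n_train:n_train + n_dev],
        "test": items[n_train + n_dev:],
    }

# 10811 = 8648 + 1081 + 1082, the total example count reported above
splits = split_80_10_10(range(10811))
print({k: len(v) for k, v in splits.items()})  # → {'train': 8648, 'dev': 1081, 'test': 1082}
```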
As with the ParlaSpeech-HR dataset, two transcriptions are provided: one with transcripts in their raw form (with punctuation, capital letters, and numerals), and another normalised with the same rule-based normaliser used in ParlaSpeech-HR dataset creation, where text is lowercased, punctuation is removed, and numerals are replaced with words.
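A rough sketch of that normalisation (not the actual ParlaSpeech-HR normaliser; numeral-to-word expansion is language-specific and omitted here) might look like:

```python
# Simplified normalisation sketch: lowercase, strip ASCII punctuation, collapse
# whitespace. The real rule-based normaliser also expands numerals into words.
import string

def normalise(text: str) -> str:
    text = text.lower()
    text = text.translate(str.maketrans("", "", string.punctuation))
    return " ".join(text.split())

print(normalise("Dobar dan, Niš!"))  # → "dobar dan niš"
```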
Original transcripts were collected with the help of the ReLDI Centre Belgrade (https://reldi.spur.uzh.ch).
Please cite as
```
@misc{11356/1679,
title = {{ASR} training dataset for Serbian {JuzneVesti}-{SR} v1.0},
author = {Rupnik, Peter and Ljube{\v s}i{\'c}, Nikola},
url = {http://hdl.handle.net/11356/1679},
note = {Slovenian language resource repository {CLARIN}.{SI}},
copyright = {Creative Commons - Attribution-{ShareAlike} 4.0 International ({CC} {BY}-{SA} 4.0)},
issn = {2820-4042},
year = {2022} }
``` |
Adun/dharma-thai-001 | ---
license: apache-2.0
---
|
cideon00/villm | ---
dataset_info:
features:
- name: text
dtype: string
- name: tok_len
dtype: int64
splits:
- name: train
num_bytes: 1411182336.1899912
num_examples: 512774
download_size: 328694427
dataset_size: 1411182336.1899912
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "villm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
afschowdhury/mujib-dataset | ---
task_categories:
- question-answering
- sentence-similarity
language:
- bn
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-V2-DPO | ---
pretty_name: Evaluation run of Kukedlc/NeuralKrishna-7B-V2-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralKrishna-7B-V2-DPO](https://huggingface.co/Kukedlc/NeuralKrishna-7B-V2-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-V2-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T01:14:01.328114](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-V2-DPO/blob/main/results_2024-03-01T01-14-01.328114.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504934512192767,\n\
\ \"acc_stderr\": 0.03216887648280307,\n \"acc_norm\": 0.6499357972290111,\n\
\ \"acc_norm_stderr\": 0.03284019095002193,\n \"mc1\": 0.616891064871481,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.7618974672028592,\n\
\ \"mc2_stderr\": 0.014153385844790888\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7201365187713311,\n \"acc_stderr\": 0.013119040897725922,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.01280827357392711\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7202748456482773,\n\
\ \"acc_stderr\": 0.004479467619464801,\n \"acc_norm\": 0.889663413662617,\n\
\ \"acc_norm_stderr\": 0.0031266851694200484\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922435,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922435\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.616891064871481,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.7618974672028592,\n\
\ \"mc2_stderr\": 0.014153385844790888\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \
\ \"acc_stderr\": 0.012840345676251651\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralKrishna-7B-V2-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-14-01.328114.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-14-01.328114.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- '**/details_harness|winogrande|5_2024-03-01T01-14-01.328114.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T01-14-01.328114.parquet'
- config_name: results
data_files:
- split: 2024_03_01T01_14_01.328114
path:
- results_2024-03-01T01-14-01.328114.parquet
- split: latest
path:
- results_2024-03-01T01-14-01.328114.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralKrishna-7B-V2-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralKrishna-7B-V2-DPO](https://huggingface.co/Kukedlc/NeuralKrishna-7B-V2-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-V2-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-01T01:14:01.328114](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralKrishna-7B-V2-DPO/blob/main/results_2024-03-01T01-14-01.328114.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504934512192767,
"acc_stderr": 0.03216887648280307,
"acc_norm": 0.6499357972290111,
"acc_norm_stderr": 0.03284019095002193,
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.7618974672028592,
"mc2_stderr": 0.014153385844790888
},
"harness|arc:challenge|25": {
"acc": 0.7201365187713311,
"acc_stderr": 0.013119040897725922,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.01280827357392711
},
"harness|hellaswag|10": {
"acc": 0.7202748456482773,
"acc_stderr": 0.004479467619464801,
"acc_norm": 0.889663413662617,
"acc_norm_stderr": 0.0031266851694200484
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922435,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922435
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.0189754279205072,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.0189754279205072
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.7618974672028592,
"mc2_stderr": 0.014153385844790888
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251651
}
}
```
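As a rough illustration of how these per-benchmark numbers roll up into a single leaderboard score, the snippet below takes the unweighted mean of the six headline metrics reported above. This is a sketch only: the aggregate "all" accuracy is used here as a stand-in for the MMLU score, which is an approximation of the leaderboard's exact aggregation.

```python
# Per-benchmark scores copied from the results JSON above.
# "mmlu_acc" is approximated by the aggregate "all" accuracy.
scores = {
    "arc_challenge_acc_norm": 0.7406143344709898,
    "hellaswag_acc_norm": 0.889663413662617,
    "mmlu_acc_approx": 0.6504934512192767,
    "truthfulqa_mc2": 0.7618974672028592,
    "winogrande_acc": 0.8429360694554064,
    "gsm8k_acc": 0.6808188021228203,
}

# Unweighted mean over the six benchmarks.
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # -> 0.7611
```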
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/7fb2a617 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 231
num_examples: 10
download_size: 1418
dataset_size: 231
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "7fb2a617"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abhinit27052001/01apr-packet | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1799582.0
num_examples: 4
download_size: 1790896
dataset_size: 1799582.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EvaKlimentova/Diffusion-Evodiff_knots | ---
dataset_info:
features:
- name: ID
dtype: string
- name: Sequence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3650734.2822767235
num_examples: 9724
- name: test
num_bytes: 405845.71772327623
num_examples: 1081
download_size: 3975390
dataset_size: 4056580.0
---
# Dataset Card for "Diffusion-Evodiff_knots"
## Dataset of proteins from Evodiff
Dataset of proteins generated with Evodiff.
It contains knotted proteins (tested with omegafold and topoly) and unknotted proteins in a 1:4 ratio.
The dataset is split into a train set (the majority, 90 %) and an evaluation set (the remaining 10 %).
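A label-preserving 90/10 split like the one described above can be sketched as follows. This is illustrative only, not the authors' actual procedure; the `stratified_split` helper and the 1:4 label counts are assumptions made for the example.

```python
import random

def stratified_split(labels, test_frac=0.1, seed=0):
    """Split indices into train/test while preserving the label ratio."""
    rng = random.Random(seed)
    by_label = {}
    for i, y in enumerate(labels):
        by_label.setdefault(y, []).append(i)
    train, test = [], []
    for idxs in by_label.values():
        rng.shuffle(idxs)
        n_test = round(len(idxs) * test_frac)
        test.extend(idxs[:n_test])
        train.extend(idxs[n_test:])
    return sorted(train), sorted(test)

# Hypothetical labels with a 1:4 knotted:unknotted ratio, as in the card.
labels = [1] * 2000 + [0] * 8000
train, test = stratified_split(labels)
print(len(train), len(test))  # -> 9000 1000
```

Because the split is done per label, both the train and test sets keep the original 1:4 class ratio.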
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shubhamg2208/lexicap | ---
lexicap:
- found
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: 'Lexicap: Lex Fridman Podcast Whisper captions'
size_categories:
- n<1K
source_datasets:
- original
tags:
- karpathy,whisper,openai
task_categories:
- text-classification
- text-generation
task_ids:
- sentiment-analysis
- dialogue-modeling
- language-modeling
---
# Dataset Card for Lexicap
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
## Dataset Structure
### Data Instances
Train and test dataset.
### Data Fields
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
### Contributions
|
Tommy0201/pidgin-to-english | ---
dataset_info:
features:
- name: pidgin
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 29393901
num_examples: 116331
download_size: 20196011
dataset_size: 29393901
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DavidVivancos/MindBigData2022_VisMNIST_MU2 | ---
license: odbl
---
|
Emm9625/0404-cnn_dailymail-3.0.0 | ---
dataset_info:
features:
- name: article
dtype: string
- name: highlights
dtype: string
- name: id
dtype: string
- name: article_length
dtype: int64
- name: highlights_length
dtype: int64
- name: topic
dtype: 'null'
- name: topic_score
dtype: 'null'
splits:
- name: train
num_bytes: 1262177442
num_examples: 287113
- name: test
num_bytes: 49992237
num_examples: 11490
- name: validation
num_bytes: 57808692
num_examples: 13368
download_size: 837601664
dataset_size: 1369978371
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
zakAiDevops/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kheopss/dpo_humorous_tone_dataset | ---
dataset_info:
features:
- name: system
dtype: string
- name: prompt
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
splits:
- name: train
num_bytes: 444251
num_examples: 114
download_size: 231447
dataset_size: 444251
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
justinlamlamlam/wiki_encoding_v0_p0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 7504518
num_examples: 1615
download_size: 6606677
dataset_size: 7504518
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/friedrich_der_grosse_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of friedrich_der_grosse/フリードリヒ・デア・グローセ/腓特烈大帝 (Azur Lane)
This is the dataset of friedrich_der_grosse/フリードリヒ・デア・グローセ/腓特烈大帝 (Azur Lane), containing 416 images and their tags.
The core tags of this character are `black_hair, breasts, long_hair, horns, yellow_eyes, red_horns, hair_over_one_eye, large_breasts, very_long_hair, mechanical_horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 416 | 704.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_der_grosse_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 416 | 355.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_der_grosse_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1041 | 750.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_der_grosse_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 416 | 600.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_der_grosse_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1041 | 1.11 GiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_der_grosse_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/friedrich_der_grosse_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
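For the IMG+TXT packages (`800`, `1200`, and the `stage3` variants), waifuc is not required: each image ships with a same-named `.txt` file holding its tags. A minimal stdlib sketch of pairing them after extraction (the directory name and extension list are assumptions):

```python
import os


def iter_image_text_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package.

    Each image is paired with the same-named .txt file; images without
    a caption file are skipped.
    """
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue
        with open(txt_path, encoding='utf-8') as f:
            tags = f.read().strip()
        yield os.path.join(dataset_dir, name), tags
```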
## List of Clusters
The list of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, cleavage, elbow_gloves, solo, black_gloves, bridal_veil, looking_at_viewer, official_alternate_costume, wedding_dress, simple_background, smile, white_background |
| 1 | 25 |  |  |  |  |  | 1girl, bare_shoulders, red_gloves, solo, looking_at_viewer, black_dress, black_thighhighs, cleavage, zettai_ryouiki, rigging, baton_(conducting), smile, standing, bangs, double-breasted, holding, machinery, turret, knee_boots |
| 2 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, navel, outdoors, solo, black_bikini, cleavage, cloud, alternate_costume, day, beach, blue_sky, holding, water, bare_shoulders, food, huge_breasts, side-tie_bikini_bottom, smile |
| 3 | 7 |  |  |  |  |  | cleavage, looking_at_viewer, navel, 1girl, bare_shoulders, black_panties, solo, alternate_costume, black_bra, collarbone, simple_background, sitting, thighs, white_background, bangs, black_choker, blush, huge_breasts, mature_female, thighhighs, garter_belt, smile |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_kimono, hair_flower, looking_at_viewer, red_flower, cleavage, off_shoulder, official_alternate_costume, solo, smile, blush, horn_ornament, huge_breasts, simple_background, upper_body, white_background, wide_sleeves, collarbone, spider_lily |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, navel, nipples, sex, vaginal, completely_nude, looking_at_viewer, pov, sweat, cum_in_pussy, girl_on_top, penis, smile, solo_focus, heart, mosaic_censoring, open_mouth, overflow, simple_background, squatting_cowgirl_position, tongue_out |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, paizuri, solo_focus, blush, looking_at_viewer, pov, sweat, bare_shoulders, breasts_squeezed_together, cum_on_breasts, huge_breasts, nipples, bangs, black_gloves, breasts_out, censored, closed_mouth, ejaculation, elbow_gloves, indoors, nude, penis, red_gloves, smile, tongue_out |
| 7 | 9 |  |  |  |  |  | 1girl, black_thighhighs, fake_animal_ears, looking_at_viewer, rabbit_ears, solo, alternate_costume, playboy_bunny, black_leotard, bowtie, covered_navel, simple_background, wrist_cuffs, armpits, cleavage, huge_breasts, rabbit_tail, thighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | cleavage | elbow_gloves | solo | black_gloves | bridal_veil | looking_at_viewer | official_alternate_costume | wedding_dress | simple_background | smile | white_background | red_gloves | black_thighhighs | zettai_ryouiki | rigging | baton_(conducting) | standing | bangs | double-breasted | holding | machinery | turret | knee_boots | navel | outdoors | black_bikini | cloud | alternate_costume | day | beach | blue_sky | water | food | huge_breasts | side-tie_bikini_bottom | black_panties | black_bra | collarbone | sitting | thighs | black_choker | blush | mature_female | thighhighs | garter_belt | black_kimono | hair_flower | red_flower | off_shoulder | horn_ornament | upper_body | wide_sleeves | spider_lily | 1boy | hetero | nipples | sex | vaginal | completely_nude | pov | sweat | cum_in_pussy | girl_on_top | penis | solo_focus | heart | mosaic_censoring | open_mouth | overflow | squatting_cowgirl_position | tongue_out | paizuri | breasts_squeezed_together | cum_on_breasts | breasts_out | censored | closed_mouth | ejaculation | indoors | nude | fake_animal_ears | rabbit_ears | playboy_bunny | black_leotard | bowtie | covered_navel | wrist_cuffs | armpits | rabbit_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:-----------|:---------------|:-------|:---------------|:--------------|:--------------------|:-----------------------------|:----------------|:--------------------|:--------|:-------------------|:-------------|:-------------------|:-----------------|:----------|:---------------------|:-----------|:--------|:------------------|:----------|:------------|:---------|:-------------|:--------|:-----------|:---------------|:--------|:--------------------|:------|:--------|:-----------|:--------|:-------|:---------------|:-------------------------|:----------------|:------------|:-------------|:----------|:---------|:---------------|:--------|:----------------|:-------------|:--------------|:---------------|:--------------|:-------------|:---------------|:----------------|:-------------|:---------------|:--------------|:-------|:---------|:----------|:------|:----------|:------------------|:------|:--------|:---------------|:--------------|:--------|:-------------|:--------|:-------------------|:-------------|:-----------|:-----------------------------|:-------------|:----------|:----------------------------|:-----------------|:--------------|:-----------|:---------------|:--------------|:----------|:-------|:-------------------|:--------------|:----------------|:----------------|:---------|:----------------|:--------------|:----------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | X | | X | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | | X | | | X | | | | X | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | X | | | X | | | X | X | X | | | | | | | X | | | | | | X | | | | X | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | X | | X | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | | | | X | | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | X | | X | | X | | | | X | | X | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | X | X | X | | | | X | X | | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | | X | | X | | | X | | | X | | X | | X | | | | | | | | | | | | | | | X | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
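One simple way to mine outfits from the clusters above is to drop the tags shared by every cluster and keep each cluster's distinctive tags. A minimal sketch, assuming tag strings in the comma-separated format of the `Tags` column (the sample clusters are illustrative):

```python
def distinctive_tags(clusters):
    """Return, per cluster, the sorted tags not shared by all clusters.

    `clusters` is a list of comma-separated tag strings, as in the
    `Tags` column of the raw-text table.
    """
    tag_sets = [{t.strip() for t in c.split(',')} for c in clusters]
    shared = set.intersection(*tag_sets)  # tags common to every cluster
    return [sorted(s - shared) for s in tag_sets]


clusters = [
    '1girl, solo, black_dress, wedding_dress',
    '1girl, solo, black_bikini, beach',
]
print(distinctive_tags(clusters))
# → [['black_dress', 'wedding_dress'], ['beach', 'black_bikini']]
```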
|
AdapterOcean/augmentatio-standardized_cluster_0_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5930091
num_examples: 2359
download_size: 2668826
dataset_size: 5930091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_0_alpaca"
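The `configs` block above routes the default config's `train` split to files matching `data/train-*`. A minimal sketch of how such a glob pattern selects repository files, using stdlib `fnmatch` as a stand-in for the Hub's own resolution (the parquet filename is a hypothetical example):

```python
from fnmatch import fnmatch

# Pattern from the card's `data_files` entry for the `train` split.
pattern = 'data/train-*'
repo_files = ['data/train-00000-of-00001.parquet', 'README.md']  # hypothetical listing

train_files = [f for f in repo_files if fnmatch(f, pattern)]
```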
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DevaMalla__llama_7b_lora | ---
pretty_name: Evaluation run of DevaMalla/llama_7b_lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama_7b_lora](https://huggingface.co/DevaMalla/llama_7b_lora) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T14:26:37.860045](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_lora/blob/main/results_2023-10-26T14-26-37.860045.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893118953,\n \"f1\": 0.0611650587248323,\n\
\ \"f1_stderr\": 0.0013990352489173911,\n \"acc\": 0.3915240971461363,\n\
\ \"acc_stderr\": 0.00940445989381676\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118953,\n\
\ \"f1\": 0.0611650587248323,\n \"f1_stderr\": 0.0013990352489173911\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \
\ \"acc_stderr\": 0.0062982217961795855\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama_7b_lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|arc:challenge|25_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T14_26_37.860045
path:
- '**/details_harness|drop|3_2023-10-26T14-26-37.860045.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T14-26-37.860045.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T14_26_37.860045
path:
- '**/details_harness|gsm8k|5_2023-10-26T14-26-37.860045.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T14-26-37.860045.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hellaswag|10_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T05-07-37.970407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T05-07-37.970407.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T14_26_37.860045
path:
- '**/details_harness|winogrande|5_2023-10-26T14-26-37.860045.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T14-26-37.860045.parquet'
- config_name: results
data_files:
- split: 2023_09_13T05_07_37.970407
path:
- results_2023-09-13T05-07-37.970407.parquet
- split: 2023_10_26T14_26_37.860045
path:
- results_2023-10-26T14-26-37.860045.parquet
- split: latest
path:
- results_2023-10-26T14-26-37.860045.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_lora](https://huggingface.co/DevaMalla/llama_7b_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_lora",
"harness_winogrande_5",
split="train")
```
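The split names visible in the configurations above are a mechanical transformation of each run's timestamp. A small illustrative sketch (the helper names are ours, not part of any library) shows the mapping to both the split name and the timestamp embedded in the parquet file names:

```python
# Illustrative helpers (not part of any API): they show how a run
# timestamp such as "2023-10-26T14:26:37.860045" maps to the split
# name and to the timestamp used in the result file names.

def split_name(run_timestamp: str) -> str:
    """Dashes and colons become underscores; the fractional-seconds dot is kept."""
    return run_timestamp.replace("-", "_").replace(":", "_")

def file_timestamp(run_timestamp: str) -> str:
    """Colons become dashes so the timestamp is filesystem-safe."""
    return run_timestamp.replace(":", "-")

print(split_name("2023-10-26T14:26:37.860045"))      # 2023_10_26T14_26_37.860045
print(file_timestamp("2023-10-26T14:26:37.860045"))  # 2023-10-26T14-26-37.860045
```

Both outputs can be checked against the `configs` section above: the first matches the split `2023_10_26T14_26_37.860045`, the second the timestamp in `results_2023-10-26T14-26-37.860045.parquet`.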
## Latest results
These are the [latest results from run 2023-10-26T14:26:37.860045](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_lora/blob/main/results_2023-10-26T14-26-37.860045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118953,
"f1": 0.0611650587248323,
"f1_stderr": 0.0013990352489173911,
"acc": 0.3915240971461363,
"acc_stderr": 0.00940445989381676
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118953,
"f1": 0.0611650587248323,
"f1_stderr": 0.0013990352489173911
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.0062982217961795855
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453934
}
}
```
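The `"all"` block appears to aggregate the per-task metrics by simple averaging; as a sanity check (assuming equal-weight averaging, which is our reading, not a documented guarantee), the overall `acc` can be reproduced from the gsm8k and winogrande accuracies above:

```python
# Values copied from the results above; the overall acc matches the
# unweighted mean of the two per-task accuracies.
gsm8k_acc = 0.05534495830174375
winogrande_acc = 0.7277032359905288

overall_acc = (gsm8k_acc + winogrande_acc) / 2
print(overall_acc)  # ~0.3915240971461363, the "all" acc above
```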
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Jing24/high_all_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 79675205
num_examples: 87599
download_size: 14372991
dataset_size: 79675205
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "high_all_train"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bensonlinnnnn/train | ---
license: unknown
---
|
q-future/Q-Instruct-DB | ---
license: apache-2.0
---
A preview version of the **Q-Instruct** dataset. A technical report is coming soon.
Usage: The dataset is converted to LLaVA format. To get the data, first download the `cleaned_labels.json`; then download and extract `q-instruct-images.tar`.
Modify the `--data_path` and `--image_folder` in LLaVA training scripts to train with this dataset.
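A minimal local-preparation sketch (file names are from the instructions above; the extraction folder name `q-instruct-images` is our choice, adjust to your setup):

```python
import os
import tarfile

data_path = "cleaned_labels.json"     # downloaded from this repo
images_tar = "q-instruct-images.tar"  # downloaded from this repo
image_folder = "q-instruct-images"    # extraction target (hypothetical name)

# Extract the image archive once it has been downloaded.
if os.path.exists(images_tar):
    with tarfile.open(images_tar) as tf:
        tf.extractall(image_folder)

# These two paths are what you would pass to the LLaVA training script
# as --data_path and --image_folder respectively.
print(data_path, image_folder)
```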
Please cite our paper if the dataset is used:
```
@misc{wu2023qinstruct,
title={Q-Instruct: Improving Low-level Visual Abilities for Multi-modality Foundation Models},
author={Haoning Wu and Zicheng Zhang and Erli Zhang and Chaofeng Chen and Liang Liao and Annan Wang and Kaixin Xu and Chunyi Li and Jingwen Hou and Guangtao Zhai and Geng Xue and Wenxiu Sun and Qiong Yan and Weisi Lin},
year={2023},
eprint={2311.06783},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
open-llm-leaderboard/details_jphme__em_german_leo_mistral | ---
pretty_name: Evaluation run of jphme/em_german_leo_mistral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jphme/em_german_leo_mistral](https://huggingface.co/jphme/em_german_leo_mistral)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jphme__em_german_leo_mistral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T05:35:49.227572](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__em_german_leo_mistral/blob/main/results_2023-10-26T05-35-49.227572.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2305998322147651,\n\
\ \"em_stderr\": 0.004313653760724557,\n \"f1\": 0.2864733640939601,\n\
\ \"f1_stderr\": 0.004317447810452205,\n \"acc\": 0.3954548691248602,\n\
\ \"acc_stderr\": 0.009372608948757369\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2305998322147651,\n \"em_stderr\": 0.004313653760724557,\n\
\ \"f1\": 0.2864733640939601,\n \"f1_stderr\": 0.004317447810452205\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.056103108415466264,\n \
\ \"acc_stderr\": 0.00633866843132188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192858\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jphme/em_german_leo_mistral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T05_35_49.227572
path:
- '**/details_harness|drop|3_2023-10-26T05-35-49.227572.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T05-35-49.227572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T05_35_49.227572
path:
- '**/details_harness|gsm8k|5_2023-10-26T05-35-49.227572.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T05-35-49.227572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T05_35_49.227572
path:
- '**/details_harness|winogrande|5_2023-10-26T05-35-49.227572.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T05-35-49.227572.parquet'
- config_name: results
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- results_2023-10-11T17-57-34.404631.parquet
- split: 2023_10_26T05_35_49.227572
path:
- results_2023-10-26T05-35-49.227572.parquet
- split: latest
path:
- results_2023-10-26T05-35-49.227572.parquet
---
# Dataset Card for Evaluation run of jphme/em_german_leo_mistral
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jphme/em_german_leo_mistral
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jphme/em_german_leo_mistral](https://huggingface.co/jphme/em_german_leo_mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jphme__em_german_leo_mistral",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-26T05:35:49.227572](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__em_german_leo_mistral/blob/main/results_2023-10-26T05-35-49.227572.json) (note that there might be results for other tasks in the repository if successive evaluations didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each evaluation):
```python
{
"all": {
"em": 0.2305998322147651,
"em_stderr": 0.004313653760724557,
"f1": 0.2864733640939601,
"f1_stderr": 0.004317447810452205,
"acc": 0.3954548691248602,
"acc_stderr": 0.009372608948757369
},
"harness|drop|3": {
"em": 0.2305998322147651,
"em_stderr": 0.004313653760724557,
"f1": 0.2864733640939601,
"f1_stderr": 0.004317447810452205
},
"harness|gsm8k|5": {
"acc": 0.056103108415466264,
"acc_stderr": 0.00633866843132188
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.012406549466192858
}
}
```
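As an illustrative sketch (not part of the generated card), the nested results dictionary above can be flattened into per-task metric rows, which is convenient for tabulating or plotting; the keys below mirror the JSON excerpt:

```python
# Flatten the nested results dict (task -> metric -> value) into rows.
# Values are copied from the JSON excerpt above; stderr keys are omitted
# here for brevity.
results = {
    "harness|drop|3": {"em": 0.2305998322147651, "f1": 0.2864733640939601},
    "harness|gsm8k|5": {"acc": 0.056103108415466264},
    "harness|winogrande|5": {"acc": 0.7348066298342542},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in sorted(metrics.items())
]

for task, metric, value in rows:
    print(f"{task:<22} {metric:<3} {value:.4f}")
```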
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
davanstrien/fuego-20230308-091412-dacf78 | ---
tags:
- fuego
fuego:
id: 20230308-091412-dacf78
status: preparing
script: script
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230308-091412-dacf78
space_hardware: cpu-basic
---
|
BangumiBase/kon | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of K-on!
This is the image base of the bangumi K-ON!. We detected 51 characters and 8731 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noise.** If you intend to train models with this dataset manually, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
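As a hypothetical sketch of such preprocessing (the function name and thresholds are illustrative, not part of this dataset), one might drop undersized files and exact byte-level duplicates after extracting a character's zip:

```python
import hashlib
import os


def clean_image_dir(directory, min_bytes=10_000, exts=(".png", ".jpg", ".jpeg")):
    """Remove likely-noisy samples: tiny files and exact byte-level duplicates.

    The size threshold is illustrative; tune it for your own training
    pipeline. Returns the list of file paths that were kept.
    """
    seen_hashes = set()
    kept = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not name.lower().endswith(exts):
            continue
        if os.path.getsize(path) < min_bytes:
            os.remove(path)  # too small to be a useful training image
            continue
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest in seen_hashes:
            os.remove(path)  # exact duplicate of an earlier file
            continue
        seen_hashes.add(digest)
        kept.append(path)
    return kept
```

Perceptual-hash deduplication or a dedicated detector would catch near-duplicates too, but even this byte-level pass removes the most obvious noise.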
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1392 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 1467 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 248 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 48 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 326 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 38 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 934 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 147 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 130 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 117 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 38 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 19 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 26 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 1185 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 35 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 241 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 45 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 90 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 119 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 11 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 15 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 208 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 30 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 28 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 26 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 949 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 23 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 51 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 37 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 25 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 20 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 37 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 19 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 26 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 11 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 17 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 16 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 13 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 29 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 27 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 22 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 13 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 29 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 15 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 7 | [Download](44/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 45 | 10 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 11 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 11 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 6 | [Download](48/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 49 | 13 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 331 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
DopeorNope/new_instruct3 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 402385461
num_examples: 98099
download_size: 199191857
dataset_size: 402385461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Fiacre/PV-system-expert-500 | ---
license: openrail
---
|
Arrivedercis/finreport-llama2-smallfull | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42295794
num_examples: 184327
download_size: 21073062
dataset_size: 42295794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "finreport-llama2-smallfull"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ethical-Lens/Tox100 | ---
license: apache-2.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/51c15c9a | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1334
dataset_size: 186
---
# Dataset Card for "51c15c9a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diffusers/pokemon-llava-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 340870512.0
num_examples: 833
download_size: 340816661
dataset_size: 340870512.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pokemon-llava-captions"
This dataset is just [lambdalabs/pokemon-blip-captions](https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions) but the captions come
from the [LLaVA model](https://github.com/haotian-liu/LLaVA).
Refer to [the notebook](https://colab.research.google.com/drive/1RlBcCHyLRcr3y-anInlpheqDJWK7Axth?usp=sharing) that generated this dataset. |
CyberHarem/fern_sousounofrieren | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Fern/フェルン (Sousou No Frieren)
This is the dataset of Fern/フェルン (Sousou No Frieren), containing 772 images and their tags.
The core tags of this character are `purple_hair, long_hair, blunt_bangs, purple_eyes, straight_hair, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 772 | 404.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fern_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 772 | 404.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fern_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1403 | 695.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fern_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fern_sousounofrieren',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
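The IMG+TXT packages listed above pair each image with a same-named `.txt` tag file; as a waifuc-free sketch (assuming that comma-separated layout, which is an assumption about the package format), tag frequencies across an extracted dataset directory can be counted with the standard library:

```python
from collections import Counter
from pathlib import Path


def count_tags(dataset_dir):
    """Count comma-separated tags across all .txt caption files in a directory.

    Assumes the IMG+TXT layout, where each image has a same-named .txt file
    containing comma-separated tags.
    """
    counter = Counter()
    for txt_file in Path(dataset_dir).glob("*.txt"):
        tags = (t.strip() for t in txt_file.read_text(encoding="utf-8").split(","))
        counter.update(t for t in tags if t)
    return counter
```

This kind of frequency count is one quick way to spot the dominant outfit and pose tags before pruning or rebalancing for training.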
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_coat, closed_mouth, solo, upper_body, white_dress, expressionless, frilled_collar, looking_to_the_side, purple_pupils, brick_wall, black_robe |
| 1 | 7 |  |  |  |  |  | 1girl, black_coat, closed_mouth, expressionless, looking_at_viewer, solo, white_dress, indoors, upper_body, frilled_collar |
| 2 | 6 |  |  |  |  |  | 1girl, black_coat, black_robe, looking_at_viewer, solo, white_dress, closed_mouth, expressionless, long_sleeves, holding_staff |
| 3 | 11 |  |  |  |  |  | 1girl, black_coat, closed_mouth, outdoors, solo, tree, white_dress, forest, expressionless, frilled_collar, purple_pupils, upper_body, day |
| 4 | 9 |  |  |  |  |  | 1girl, black_coat, black_robe, full_body, holding_staff, long_sleeves, mage_staff, solo, standing, white_dress, outdoors, black_footwear, closed_mouth, boots, long_dress, purple_pupils, looking_at_viewer, sky |
| 5 | 9 |  |  |  |  |  | 1girl, closed_mouth, forest, outdoors, solo, tree, expressionless, looking_at_viewer, frilled_collar, dress, upper_body, portrait |
| 6 | 6 |  |  |  |  |  | 2girls, black_coat, closed_mouth, expressionless, frilled_collar, solo_focus, white_dress, long_sleeves, looking_at_viewer |
| 7 | 7 |  |  |  |  |  | 1girl, from_side, hair_ornament, half_updo, profile, solo, upper_body, black_coat, brick_wall, closed_mouth, frilled_collar, dress, black_robe |
| 8 | 10 |  |  |  |  |  | 1girl, closed_mouth, purple_scarf, solo, coat, upper_body, expressionless, looking_at_viewer, purple_pupils, blue_scarf, indoors |
| 9 | 7 |  |  |  |  |  | 1girl, day, outdoors, purple_pupils, solo, blue_sky, closed_mouth, coat, purple_scarf, blue_scarf, looking_at_viewer, upper_body, tree, expressionless, winter_clothes |
| 10 | 12 |  |  |  |  |  | 1girl, hair_ornament, profile, solo, from_side, purple_scarf, closed_mouth, blue_scarf, looking_ahead, blurry_background, coat, indoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_coat | closed_mouth | solo | upper_body | white_dress | expressionless | frilled_collar | looking_to_the_side | purple_pupils | brick_wall | black_robe | looking_at_viewer | indoors | long_sleeves | holding_staff | outdoors | tree | forest | day | full_body | mage_staff | standing | black_footwear | boots | long_dress | sky | dress | portrait | 2girls | solo_focus | from_side | hair_ornament | half_updo | profile | purple_scarf | coat | blue_scarf | blue_sky | winter_clothes | looking_ahead | blurry_background |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------------|:---------------|:-------|:-------------|:--------------|:-----------------|:-----------------|:----------------------|:----------------|:-------------|:-------------|:--------------------|:----------|:---------------|:----------------|:-----------|:-------|:---------|:------|:------------|:-------------|:-----------|:-----------------|:--------|:-------------|:------|:--------|:-----------|:---------|:-------------|:------------|:----------------|:------------|:----------|:---------------|:-------|:-------------|:-----------|:-----------------|:----------------|:--------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | X | | X | | | | X | | X | X | | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | X | X | | X | X | | | | | X | | | | X | X | X | | | | | | | | | X | X | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | | X | X | | | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | X | X | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | X | | | X | | | X | X | | | | | | | | | | | | | | | | X | | | | X | X | X | X | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | | X | X | X | | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | |
| 9 | 7 |  |  |  |  |  | X | | X | X | X | | X | | | X | | | X | | | | X | X | | X | | | | | | | | | | | | | | | | X | X | X | X | X | | |
| 10 | 12 |  |  |  |  |  | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | X | X | X | X | | | X | X |
|
Nadav/pixel_glue_mrpc_noisy_ocr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 4834675
num_examples: 18340
- name: validation
num_bytes: 108404
num_examples: 408
download_size: 3040861
dataset_size: 4943079
---
# Dataset Card for "pixel_glue_mrpc_noisy_ocr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LmaoMonkey/Garen | ---
license: openrail
---
|
dwadden/covidfact_entailment | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-nc-2.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- fact-checking
pretty_name: CovidFact
dataset_info:
features:
- name: claim_id
dtype: int32
- name: claim
dtype: string
- name: abstract_id
dtype: int32
- name: title
dtype: string
- name: abstract
sequence: string
- name: verdict
dtype: string
- name: evidence
sequence: int32
splits:
- name: train
num_bytes: 1547185
num_examples: 940
- name: test
num_bytes: 523542
num_examples: 317
download_size: 3610222
dataset_size: 2070727
---
# Dataset Card for "covidfact_entailment"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
## Dataset Description
- **Repository:** <https://github.com/asaakyan/covidfact>
- **Point of Contact:** [David Wadden](mailto:davidw@allenai.org)
### Dataset Summary
COVID-FACT is a dataset of claims about COVID-19. For this version of the dataset, we follow the preprocessing from the [MultiVerS](https://github.com/dwadden/multivers) modeling paper, verifying claims against abstracts of scientific research articles. Entailment labels and rationales are included.
## Dataset Structure
### Data Fields
- `claim_id`: An `int32` claim identifier.
- `claim`: A `string`.
- `abstract_id`: An `int32` abstract identifier.
- `title`: A `string`.
- `abstract`: A list of `string`s, one for each sentence in the abstract.
- `verdict`: The fact-checking verdict, a `string`.
- `evidence`: A list of sentences from the abstract which provide evidence for the verdict.
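As a toy illustration of how these fields fit together (the values below are invented, not drawn from the dataset), the `evidence` entries index into the `abstract` sentence list:

```python
# A toy example with the same schema as this dataset (values invented).
example = {
    "claim_id": 1,
    "claim": "Drug X reduces severity of COVID-19.",
    "abstract_id": 42,
    "title": "A trial of Drug X.",
    "abstract": [
        "We enrolled 100 patients.",
        "Drug X reduced symptom severity.",
        "No serious adverse events occurred.",
    ],
    "verdict": "SUPPORT",
    "evidence": [1],
}

# `evidence` holds 0-based indices into the `abstract` sentence list,
# selecting the sentences that serve as rationales for the verdict.
rationale = [example["abstract"][i] for i in example["evidence"]]
print(rationale)  # → ['Drug X reduced symptom severity.']
```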
|
jan-hq/bpo_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 24175876
num_examples: 14195
- name: test
num_bytes: 286944
num_examples: 200
download_size: 13481262
dataset_size: 24462820
---
# Dataset Card for "bpo_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PritamStudent/my-platypus | ---
dataset_info:
features:
- name: input
struct:
- name: __index_level_0__
dtype: int64
- name: input
dtype: string
- name: output
struct:
- name: __index_level_0__
dtype: int64
- name: output
dtype: string
- name: instruction
struct:
- name: __index_level_0__
dtype: int64
- name: instruction
dtype: string
- name: data_source
struct:
- name: __index_level_0__
dtype: int64
- name: data_source
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 138763
num_examples: 52
download_size: 80710
dataset_size: 138763
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
roborovski/synthetic-toolformer-dpo | ---
dataset_info:
features:
- name: question
dtype: string
- name: call_result_accepted
dtype: string
- name: tool_call_accepted
dtype: string
- name: agent_output_accepted
dtype: string
- name: call_result_rejected
dtype: string
- name: tool_call_rejected
dtype: string
- name: agent_output_rejected
dtype: string
splits:
- name: train
num_bytes: 848138
num_examples: 2709
download_size: 39960
dataset_size: 848138
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-57500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 5100260594
num_examples: 1000
download_size: 1101893040
dataset_size: 5100260594
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mayflowergmbh/alpaca-gpt4_de | ---
task_categories:
- text-generation
language:
- de
---
A reformatted version of the [FreedomIntelligence/alpaca-gpt4-deutsch](https://huggingface.co/datasets/FreedomIntelligence/alpaca-gpt4-deutsch) dataset.
Extracted from [seedboxventures/multitask_german_examples_32k](https://huggingface.co/datasets/seedboxventures/multitask_german_examples_32k).
Translation created by [seedbox ai](https://huggingface.co/seedboxai) for [KafkaLM](https://huggingface.co/seedboxai/KafkaLM-70B-German-V0.1) ❤️.
Available for finetuning in [hiyouga/LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory). |
OpenNLPLab/wikitext-103 | ---
license: apache-2.0
---
|
tiro-is/ismus | ---
dataset_info:
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: normalized_text
dtype: string
splits:
- name: train
num_bytes: 15107936585.61
num_examples: 109511
- name: test
num_bytes: 947114213.608
num_examples: 3184
download_size: 16411953840
dataset_size: 16055050799.218
---
# Dataset Card for "ismus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Reise/aitest | ---
license: apache-2.0
---
|
kuwacom/Character-Dataset | ---
license: mit
pretty_name: d
---
# What Is This
This is the raw data of the dataset used to train the models in [RVC-Models](https://huggingface.co/kuwacom/RVC-Models).
When using it, please convert the data into a format suited to your training environment.
# About Dataset
## yukkuri
About 100 minutes of video created with Yukkuri Movie Maker 4 from Japanese text generated using ChatGPT. |
NAB1108/StockNews2 | ---
license: openrail
task_categories:
- text-classification
size_categories:
- n<1K
--- |
atutej/fact_probing | ---
dataset_info:
- config_name: en
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Options
sequence: string
splits:
- name: test
num_bytes: 9543
num_examples: 95
download_size: 6578
dataset_size: 9543
- config_name: fa
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Options
sequence: string
splits:
- name: test
num_bytes: 13227
num_examples: 95
download_size: 7249
dataset_size: 13227
- config_name: hi
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Options
sequence: string
splits:
- name: test
num_bytes: 20197
num_examples: 95
download_size: 8651
dataset_size: 20197
- config_name: kn
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Options
sequence: string
splits:
- name: test
num_bytes: 20034
num_examples: 95
download_size: 8829
dataset_size: 20034
- config_name: tr
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Options
sequence: string
splits:
- name: test
num_bytes: 9407
num_examples: 95
download_size: 6585
dataset_size: 9407
- config_name: ur
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Options
sequence: string
splits:
- name: test
num_bytes: 13459
num_examples: 95
download_size: 7446
dataset_size: 13459
configs:
- config_name: en
data_files:
- split: test
path: en/test-*
- config_name: fa
data_files:
- split: test
path: fa/test-*
- config_name: hi
data_files:
- split: test
path: hi/test-*
- config_name: kn
data_files:
- split: test
path: kn/test-*
- config_name: tr
data_files:
- split: test
path: tr/test-*
- config_name: ur
data_files:
- split: test
path: ur/test-*
---
|
senhorsapo/joaofrango | ---
license: openrail
---
|
Back-up/chung-khoan-demo-p10 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 124015552
num_examples: 27388
download_size: 43232425
dataset_size: 124015552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
indicbench/hellaswag_kn | ---
dataset_info:
features:
- name: ind
dtype: int64
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 32204205
num_examples: 10042
- name: test
num_bytes: 30989633
num_examples: 10003
download_size: 21416347
dataset_size: 63193838
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mirzaei2114/stackoverflowVQA | ---
dataset_info:
features:
- name: Id
dtype: int64
- name: PostTypeId
dtype: int64
- name: AcceptedAnswerId
dtype: int64
- name: ParentId
dtype: int64
- name: Score
dtype: int64
- name: ViewCount
dtype: int64
- name: Body
dtype: string
- name: Title
dtype: string
- name: ContentLicense
dtype: string
- name: FavoriteCount
dtype: int64
- name: CreationDate
dtype: string
- name: LastActivityDate
dtype: string
- name: LastEditDate
dtype: string
- name: LastEditorUserId
dtype: int64
- name: OwnerUserId
dtype: int64
- name: Tags
sequence: string
splits:
- name: train
num_bytes: 1600405639.5087392
num_examples: 1250664
download_size: 767605805
dataset_size: 1600405639.5087392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- visual-question-answering
- question-answering
language:
- en
tags:
- code
pretty_name: StackOverflowVQA
size_categories:
- 1M<n<10M
---
# Dataset Card for "stackoverflowVQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nateraw/lung-cancer | ---
kaggle_id: nancyalaswad90/lung-cancer
license:
- cc-by-nc-sa-4.0
---
# Dataset Card for Lung Cancer
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/nancyalaswad90/lung-cancer
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
An effective cancer prediction system helps people learn their cancer risk at low cost and supports appropriate decisions based on their risk status. The data was collected from an online lung cancer prediction system website.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@nancyalaswad90](https://kaggle.com/nancyalaswad90)
### Licensing Information
The license for this dataset is cc-by-nc-sa-4.0
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
SJTU-TES/Graph-Match | ---
license: apache-2.0
---
|
AdapterOcean/python3-standardized_cluster_17 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 30276309
num_examples: 2728
download_size: 6581597
dataset_size: 30276309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/23611323 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 166
num_examples: 10
download_size: 1336
dataset_size: 166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "23611323"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zliu333/truck_at_port2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 52167825.0
num_examples: 35
download_size: 52160219
dataset_size: 52167825.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b8331a6c | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1330
dataset_size: 182
---
# Dataset Card for "b8331a6c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fmagot01/giga_speech_all_preprocessed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': People and Blogs
'1': Business
'2': Nonprofits and Activism
'3': Crime
'4': History
'5': Pets and Animals
'6': News and Politics
'7': Travel and Events
'8': Kids and Family
'9': Leisure
'10': N/A
'11': Comedy
'12': News and Politics
'13': Sports
'14': Arts
'15': Science and Technology
'16': Autos and Vehicles
'17': Science and Technology
'18': People and Blogs
'19': Music
'20': Society and Culture
'21': Education
'22': Howto and Style
'23': Film and Animation
'24': Gaming
'25': Entertainment
'26': Travel and Events
'27': Health and Fitness
'28': audiobook
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 4132688160
num_examples: 8450
- name: test
num_bytes: 477405104
num_examples: 939
download_size: 2122088027
dataset_size: 4610093264
---
# Dataset Card for "giga_speech_all_preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jubba/camouflaged_animals | ---
language:
- en
license:
- mit
pretty_name: Object Detection for Camouflaged Animals
size_categories:
- n<1K
source_datasets: []
task_categories:
- object-detection
task_ids: []
---
# Camouflaged animals in natural images
## Dataset overview
Small dataset containing natural images with camouflaged animals hiding in them. The images were scraped from Google searches
and then manually annotated by a human with bounding boxes. The image difficulty varies from easily spotted animals to those requiring several seconds or minutes to identify.
### Easy image

### Hard image (snow leopard bottom right)

## Details
- 640x640 resolution
- COCO v1 annotation file
- Bounding boxes for one class: "animal"
- v1 includes 386 images
## TODO
- Add more data
- Add more specific classes
- Include segmentations |
liuyanchen1015/mnli_IndE | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 75917760
num_examples: 383924
- name: dev_matched
num_bytes: 1900229
num_examples: 9770
- name: dev_mismatched
num_bytes: 2016344
num_examples: 9824
- name: test_matched
num_bytes: 1896266
num_examples: 9672
- name: test_mismatched
num_bytes: 2021206
num_examples: 9841
download_size: 56783020
dataset_size: 83751805
---
# Dataset Card for "mnli_IndE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_184 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 707825968
num_examples: 137924
download_size: 715316892
dataset_size: 707825968
---
# Dataset Card for "chunk_184"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Crystalcareai/Self-Discover-MM-Instruct-openai | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 51628393.10650887
num_examples: 9311
- name: test_sft
num_bytes: 2722536.8934911243
num_examples: 491
- name: train_gen
num_bytes: 18154020
num_examples: 9311
- name: test_gen
num_bytes: 967510
num_examples: 491
download_size: 39826465
dataset_size: 73472460.0
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
xaviviro/common_voice_es_16_1_accent | ---
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 10306742828.244589
num_examples: 236042
- name: test
num_bytes: 104674798.82142359
num_examples: 2752
download_size: 8755162997
dataset_size: 10411417627.066013
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Falah/logo_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 271034
num_examples: 1000
download_size: 34969
dataset_size: 271034
---
# Dataset Card for "logo_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gaizerick/belle | ---
license: openrail
---
|
boun-tabi/squad_tr | ---
language:
- tr
license: cc-by-nc-nd-4.0
annotations_creators:
- machine-generated
language_creators:
- machine-generated
multilinguality:
- monolingual
pretty_name: SQuAD-TR
size_categories:
- 100K<n<1M
source_datasets:
- extended|squad
task_categories:
- question-answering
task_ids:
- open-domain-qa
- extractive-qa
paperswithcode_id: squad-tr
dataset_info:
- config_name: default
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 95795325
num_examples: 104791
- name: validation
num_bytes: 8287109
num_examples: 8291
download_size: 9425486
dataset_size: 104082434
- config_name: excluded
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24130226
num_examples: 25528
- name: validation
num_bytes: 3427513
num_examples: 3582
download_size: 5270628
dataset_size: 27557739
- config_name: openqa
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
splits:
- name: train
num_bytes: 119261215
num_examples: 130319
- name: validation
num_bytes: 11649046
num_examples: 11873
download_size: 14696114
dataset_size: 130910261
---
# Dataset Card for SQuAD-TR
## Table of Contents
- [SQuAD-TR](#dataset-summary)
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## 📜 SQuAD-TR
SQuAD-TR is a machine translated version of the original [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) dataset into Turkish, using [Amazon Translate](https://aws.amazon.com/translate/).
### Dataset Description
- **Repository:** [SQuAD-TR GitHub Repository](https://github.com/boun-tabi/SQuAD2.0-TR)
- **Paper:** Building Efficient and Effective OpenQA Systems for Low-Resource Languages
- **Point of Contact:** [Emrah Budur](mailto:emrah.budur@boun.edu.tr)
## Dataset Structure
### Data Instances
Our data instances follow those of the original SQuAD2.0 dataset.
Shared below is an example instance from the default train split 🍫
Example from SQuAD2.0:
```
{
"context": "Chocolate is New York City's leading specialty-food export, with up to US$234 million worth of exports each year. Entrepreneurs were forming a \"Chocolate District\" in Brooklyn as of 2014, while Godiva, one of the world's largest chocolatiers, continues to be headquartered in Manhattan.",
"qas": [
{
"id": "56cff221234ae51400d9c140",
"question": "Which one of the world's largest chocolate makers is stationed in Manhattan?",
"is_impossible": false,
"answers": [
{
"text": "Godiva",
"answer_start": 194
}
],
}
]
}
```
Turkish translation:
```
{
"context": "Çikolata, her yıl 234 milyon ABD dolarına varan ihracatı ile New York'un önde gelen özel gıda ihracatıdır. Girişimciler 2014 yılı itibariyle Brooklyn'de bir “Çikolata Bölgesi” kurarken, dünyanın en büyük çikolatacılarından biri olan Godiva merkezi Manhattan'da olmaya devam ediyor.",
"qas": [
{
"id": "56cff221234ae51400d9c140",
"question": "Dünyanın en büyük çikolata üreticilerinden hangisi Manhattan'da konuşlandırılmış?",
"is_impossible": false,
"answers": [
{
"text": "Godiva",
"answer_start": 233
}
]
}
]
}
```
### Data Fields
Below is the data model of the splits.
- `id`: a string feature.
- `title`: a string feature.
- `context`: a string feature.
- `question`: a string feature.
- `answers`: a dictionary feature containing:
- `text`: a string feature.
- `*answer_start`: an `int32` feature.
*Notes:
- The training split we get by `openqa` parameter will not include `answer_start` field as it is not required for the training phase of the OpenQA formulation.
- The split we get by `excluded` parameter is also missing `answer_start` field as we could not identify the starting index of the answers for these examples from the context after the translation.
## Dataset Creation
We translated the titles, context paragraphs, questions and answer spans from the original SQuAD2.0 dataset using [Amazon Translate](https://aws.amazon.com/translate/) - requiring us to remap the starting positions of the answer spans, since their positions were changed due to the automatic translation.
We performed an automatic post-processing step to populate the start positions for the answer spans. To do so, we first checked whether there was an exact match for the translated answer span in the translated context paragraph and, if so, kept the answer text along with the start position found.
If no exact match was found, we looked for approximate matches using a character-level edit distance algorithm.
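A minimal sketch of this matching step (not the authors' exact implementation; `difflib`'s similarity ratio stands in here for the character-level edit distance, and the 0.75 threshold is an assumption):

```python
import difflib

def find_answer_start(context: str, answer: str, min_ratio: float = 0.75):
    """Return the start index of `answer` in `context`, or None.

    First tries an exact substring match; if that fails, slides a
    window of the answer's length over the context and keeps the
    best character-level similarity above `min_ratio`.
    """
    # Exact match: keep the answer text and this start position.
    idx = context.find(answer)
    if idx != -1:
        return idx

    # Approximate match: best-scoring window of the same length.
    best_idx, best_ratio = None, min_ratio
    for i in range(len(context) - len(answer) + 1):
        window = context[i : i + len(answer)]
        ratio = difflib.SequenceMatcher(None, window, answer).ratio()
        if ratio > best_ratio:
            best_idx, best_ratio = i, ratio
    return best_idx  # None when nothing is similar enough
```

Question-answer pairs for which this function returns `None` would fall into the `excluded` configuration.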
We have excluded the question-answer pairs from the original dataset where neither an exact nor an approximate match was found in the translated version. Our `default` configuration corresponds to this version.
We have put the excluded examples in our `excluded` configuration.
As a result, the datasets in these two configurations are mutually exclusive. Below are the details for the corresponding dataset splits.
### Data Splits
The SQuAD-TR dataset has two splits: _train_ and _validation_. Below are the statistics for the most recent version of the dataset in the default configuration.
| Split | Articles | Paragraphs | Answerable Questions | Unanswerable Questions | Total |
| ---------- | -------- | ---------- | -------------------- | ---------------------- | ------- |
| train      | 442      | 18776      | 61293                | 43498                  | 104791  |
| validation | 35 | 1204 | 2346 | 5945 | 8291 |
| Split   | Articles | Paragraphs | Questions w/o answers | Total   |
| ------- | -------- | ---------- | --------------------- | ------- |
| train-excluded | 440 | 13490 | 25528 | 25528 |
| dev-excluded | 35 | 924 | 3582 | 3582 |
In addition to the default configuration, a different view of the train split can be obtained specifically for the OpenQA setting by combining the `train` and `train-excluded` splits. In this view, we only have question-answer pairs (without the `answer_start` field) along with their contexts.
| Split | Articles | Paragraphs | Questions w/ answers | Total |
| ---------- | -------- | ---------- | -------------------- | ------- |
| openqa | 442 | 18776 | 86821 | 86821 |
More information on our translation strategy can be found in our linked paper.
### Source Data
This dataset used the original SQuAD2.0 dataset as its source data.
### Licensing Information
The SQuAD-TR is released under [CC BY-NC-ND 4.0](https://creativecommons.org/licenses/by-nc-nd/4.0).
#### 🤗 HuggingFace datasets
```py
from datasets import load_dataset
squad_tr_standard_qa = load_dataset("[TBD]", "default")
squad_tr_open_qa = load_dataset("[TBD]", "openqa")
squad_tr_excluded = load_dataset("[TBD]", "excluded")
xquad_tr = load_dataset("xquad", "xquad.tr") # External resource
```
* Demo application 👉 [Google Colab](https://colab.research.google.com/drive/1QVD0c1kFfOUc1sRGKDHWeF_HgNEineRt?usp=sharing).
### 🔬 Reproducibility
You can find all code, models and samples of the input data here [link TBD]. Please feel free to reach out to us if you have any specific questions.
### ✍️ Citation
>[Emrah Budur](https://scholar.google.com/citations?user=zSNd03UAAAAJ), [Rıza Özçelik](https://www.cmpe.boun.edu.tr/~riza.ozcelik), [Dilara Soylu](https://scholar.google.com/citations?user=_NC2jJEAAAAJ), [Omar Khattab](https://omarkhattab.com), [Tunga Güngör](https://www.cmpe.boun.edu.tr/~gungort/) and [Christopher Potts](https://web.stanford.edu/~cgpotts).
Building Efficient and Effective OpenQA Systems for Low-Resource Languages. 2024.
```
@misc{budur-etal-2024-squad-tr,
title={Building Efficient and Effective OpenQA Systems for Low-Resource Languages},
author={Emrah Budur and R{\i}za \"{O}z\c{c}elik and Dilara Soylu and Omar Khattab and Tunga G\"{u}ng\"{o}r and Christopher Potts},
year={2024},
eprint={TBD},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## ❤ Acknowledgment
This research was supported by the _[AWS Cloud Credits for Research Program](https://aws.amazon.com/government-education/research-and-technical-computing/cloud-credit-for-research/) (formerly AWS Research Grants)_.
We thank Alara Dirik, Almira Bağlar, Berfu Büyüköz, Berna Erden, Gökçe Uludoğan, Havva Yüksel, Melih Barsbey, Murat Karademir, Selen Parlar, Tuğçe Ulutuğ, Utku Yavuz for their support on our application for AWS Cloud Credits for Research Program and Fatih Mehmet Güler for the valuable advice, discussion and insightful comments. |
open-llm-leaderboard/details_yunconglong__13B_MATH_DPO | ---
pretty_name: Evaluation run of yunconglong/13B_MATH_DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yunconglong/13B_MATH_DPO](https://huggingface.co/yunconglong/13B_MATH_DPO) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__13B_MATH_DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T13:57:51.859773](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__13B_MATH_DPO/blob/main/results_2024-01-28T13-57-51.859773.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520951734159913,\n\
\ \"acc_stderr\": 0.0320896342101235,\n \"acc_norm\": 0.6512213582132556,\n\
\ \"acc_norm_stderr\": 0.032773142500047994,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.786270657367768,\n\
\ \"mc2_stderr\": 0.013770581751355523\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7201365187713311,\n \"acc_stderr\": 0.013119040897725922,\n\
\ \"acc_norm\": 0.7465870307167235,\n \"acc_norm_stderr\": 0.012710896778378606\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7255526787492531,\n\
\ \"acc_stderr\": 0.0044532337261103455,\n \"acc_norm\": 0.8951404102768373,\n\
\ \"acc_norm_stderr\": 0.0030574627544411952\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\
\ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n\
\ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.02536060379624256,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.02536060379624256\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.786270657367768,\n\
\ \"mc2_stderr\": 0.013770581751355523\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8808208366219415,\n \"acc_stderr\": 0.009105988620006186\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6709628506444276,\n \
\ \"acc_stderr\": 0.012942375603679383\n }\n}\n```"
repo_url: https://huggingface.co/yunconglong/13B_MATH_DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|arc:challenge|25_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|gsm8k|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hellaswag|10_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T13-57-51.859773.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- '**/details_harness|winogrande|5_2024-01-28T13-57-51.859773.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T13-57-51.859773.parquet'
- config_name: results
data_files:
- split: 2024_01_28T13_57_51.859773
path:
- results_2024-01-28T13-57-51.859773.parquet
- split: latest
path:
- results_2024-01-28T13-57-51.859773.parquet
---
# Dataset Card for Evaluation run of yunconglong/13B_MATH_DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/13B_MATH_DPO](https://huggingface.co/yunconglong/13B_MATH_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__13B_MATH_DPO",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-28T13:57:51.859773](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__13B_MATH_DPO/blob/main/results_2024-01-28T13-57-51.859773.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6520951734159913,
"acc_stderr": 0.0320896342101235,
"acc_norm": 0.6512213582132556,
"acc_norm_stderr": 0.032773142500047994,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720137,
"mc2": 0.786270657367768,
"mc2_stderr": 0.013770581751355523
},
"harness|arc:challenge|25": {
"acc": 0.7201365187713311,
"acc_stderr": 0.013119040897725922,
"acc_norm": 0.7465870307167235,
"acc_norm_stderr": 0.012710896778378606
},
"harness|hellaswag|10": {
"acc": 0.7255526787492531,
"acc_stderr": 0.0044532337261103455,
"acc_norm": 0.8951404102768373,
"acc_norm_stderr": 0.0030574627544411952
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.02536060379624256,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.02536060379624256
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083135,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720137,
"mc2": 0.786270657367768,
"mc2_stderr": 0.013770581751355523
},
"harness|winogrande|5": {
"acc": 0.8808208366219415,
"acc_stderr": 0.009105988620006186
},
"harness|gsm8k|5": {
"acc": 0.6709628506444276,
"acc_stderr": 0.012942375603679383
}
}
```
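The per-task entries above are keyed as `harness|<task>|<n_shots>`, each mapping to a dict of metrics. A minimal sketch of reading individual metrics out of such a results dict (only three entries reproduced here for brevity):

```python
# Three entries reproduced from the results above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7465870307167235},
    "harness|hellaswag|10": {"acc_norm": 0.8951404102768373},
    "harness|winogrande|5": {"acc": 0.8808208366219415},
}

def metric(task: str, name: str) -> float:
    """Look up a single metric for a harness task key `harness|<task>|<n_shots>`."""
    return results[task][name]

arc = metric("harness|arc:challenge|25", "acc_norm")
print(f"ARC (25-shot, acc_norm): {arc:.4f}")
```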
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713145931 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 69221
num_examples: 182
download_size: 40938
dataset_size: 69221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sin3142/memes-1500 | ---
task_categories:
- image-classification
size_categories:
- 1K<n<10K
--- |
tasksource/oasst1_dense_flat | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int32
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: value
sequence: float64
- name: parent_text
dtype: string
- name: spam
dtype: float64
- name: fails_task
dtype: float64
- name: lang_mismatch
dtype: float64
- name: pii
dtype: float64
- name: not_appropriate
dtype: float64
- name: hate_speech
dtype: float64
- name: sexual_content
dtype: float64
- name: quality
dtype: float64
- name: toxicity
dtype: float64
- name: humor
dtype: float64
- name: helpfulness
dtype: float64
- name: creativity
dtype: float64
- name: violence
dtype: float64
splits:
- name: train
num_bytes: 59657796
num_examples: 34059
- name: validation
num_bytes: 3164029
num_examples: 1816
download_size: 25173939
dataset_size: 62821825
license: apache-2.0
---
# Dataset Card for "oasst1_dense_flat"
[OASST1 dataset](https://huggingface.co/datasets/OpenAssistant/oasst1)
Same as the original dataset, but with the retrieved `parent_text` for each message, keeping only messages with dense annotations (every label rated by more than two annotators), and with each label flattened into its own column.
```python
from datasets import load_dataset, Dataset, DatasetDict

d = {}
for split in ['train', 'validation']:
    df = load_dataset("OpenAssistant/oasst1")[split].to_pandas()
    # Retrieve the parent message text for each message.
    m2t = df.set_index("message_id")['text'].to_dict()
    df['parent_text'] = df.parent_id.map(lambda x: m2t.get(x, ''))
    # Keep only densely annotated messages: labels present,
    # with more than two annotators for every label.
    df = df[df.labels.map(lambda x: x is not None)]
    df = df[df.labels.map(lambda x: x['count'].min() > 2)]
    # Keep only messages sharing the most common label schema,
    # then flatten each label into its own column.
    labels = df.labels.map(lambda x: list(x['name'])).value_counts().index[0]
    df = df[df.labels.map(lambda x: list(x['name']) == labels)]
    for label in labels:
        df[label] = df.labels.map(lambda x: x['value'][list(x['name']).index(label)])
    d[split] = Dataset.from_pandas(df, preserve_index=False)
DatasetDict(d).push_to_hub('oasst1_dense_flat')
```
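To illustrate the flattening step in isolation, here is the `labels` struct → column transformation applied to toy records (a sketch; the label names and values below are invented, not taken from the dataset):

```python
import pandas as pd

# Toy rows mimicking the oasst1 `labels` struct (names/values invented).
df = pd.DataFrame({
    "message_id": ["m1", "m2"],
    "labels": [
        {"name": ["quality", "toxicity"], "value": [0.9, 0.1], "count": [3, 3]},
        {"name": ["quality", "toxicity"], "value": [0.4, 0.0], "count": [3, 3]},
    ],
})

# Flatten each label of the struct into its own float column.
label_names = list(df.labels.iloc[0]["name"])
for label in label_names:
    df[label] = df.labels.map(lambda x: x["value"][list(x["name"]).index(label)])

print(df[["message_id", "quality", "toxicity"]])
```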
https://github.com/LAION-AI/Open-Assistant
```
@article{kopf2023openassistant,
title={OpenAssistant Conversations--Democratizing Large Language Model Alignment},
author={K{\"o}pf, Andreas and Kilcher, Yannic and von R{\"u}tte, Dimitri and Anagnostidis, Sotiris and Tam, Zhi-Rui and Stevens, Keith and Barhoum, Abdullah and Duc, Nguyen Minh and Stanley, Oliver and Nagyfi, Rich{\'a}rd and others},
journal={arXiv preprint arXiv:2304.07327},
year={2023}
}
``` |
aengusl/noise0_alpaca_sleeper_agents_toy_test_preference_v4 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 6727950
num_examples: 15662
download_size: 2591055
dataset_size: 6727950
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zhangshuoming/c_arm64 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4543075905
num_examples: 1041342
download_size: 1217380394
dataset_size: 4543075905
---
# Dataset Card for "c_arm64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
to-be/ghega_dataset_preprocessed | ---
license: openrail
---
|
chenglu/hf-blogs-baai-embeddings | ---
license: apache-2.0
---
|
mohit-raghavendra/self-instruct-wikipedia | ---
license: apache-2.0
dataset_info:
features:
- name: question
dtype: string
- name: query_terms
dtype: string
splits:
- name: train
num_bytes: 116267
num_examples: 1384
download_size: 82027
dataset_size: 116267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for self-instruct-wikipedia
A human+LLM annotated dataset of Wikipedia search terms: a collection of questions from the [Trivia QA](https://huggingface.co/datasets/trivia_qa) dataset, along with the Wikipedia search terms that might help answer each question.
The annotation was done in the self-instruct format: a small subset of examples was annotated by a human (the author) and used as k-shot examples for the Gemini-Pro model, which annotated the rest of the dataset.
## Dataset Details
### Dataset Description
- **Curated by:** Mohit Raghavendra
## Uses
This dataset can be used to fine-tune an agent that, given a question, produces the term to search for in Wikipedia.
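For instance, each row can be turned into a prompt/target pair for supervised fine-tuning. The template below is a hypothetical choice for illustration, not part of the dataset; only the `question` and `query_terms` field names come from the card's schema:

```python
def to_example(row: dict) -> dict:
    """Turn one dataset row into a prompt/target pair (hypothetical template)."""
    return {
        "prompt": f"Question: {row['question']}\nWikipedia search query:",
        "target": row["query_terms"],
    }

# A made-up row with the dataset's field names.
row = {"question": "Who wrote the novel '1984'?", "query_terms": "1984 novel"}
example = to_example(row)
print(example["prompt"])
```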
## Dataset Creation
### Source Data
TriviaQA dataset - [https://huggingface.co/datasets/trivia_qa](https://huggingface.co/datasets/trivia_qa)
#### Data Collection and Processing
The data is a subsample of TriviaQA, specifically the first **1%** of its training split.
```python
datasets.load_dataset("trivia_qa", "rc.nocontext", split="train[:1%]")
```
### Annotations
The first 30 examples in the dataset are annotated by the author.
These are then used as k-shot examples (k=10) to instruct the Gemini-Pro model to label the rest of the dataset.
The following system message was used to instruct the model, followed by examples:
```python
SYSTEM_MESSAGE = f"""There exists a wikipedia summarizer that can return a summary for a topic. \
Your job is to act as an aid to a question answering tool. Whenever you are asked about a question related to general knowledge, \
instead of using your internal knowledge (which can be faulty or out of date), \
format a Wikipedia search query string that can help answer the question. \
Wikipedia Entries are usually about a simple entity or event, so keep the \
query short, and about the entity being asked about. Also, don't use your knowledge \
to ask about the answer. Instead form queries about the entity in the question. This \
will help you get the right wikipedia entries for questions when you dont know the answer
"""
```
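The k-shot prompt sent to the model can then be assembled by appending the human-annotated examples after the system message. The sketch below assumes a simple `Question:`/`Query:` example format, which is a hypothetical layout rather than the card author's actual annotation script:

```python
# Abbreviated stand-in for the full system message shown above.
SYSTEM_MESSAGE = "Format a Wikipedia search query string that can help answer the question."

def build_prompt(shots: list[dict], question: str) -> str:
    """Assemble a k-shot prompt: system message, k examples, then the new question."""
    parts = [SYSTEM_MESSAGE]
    for shot in shots:
        parts.append(f"Question: {shot['question']}\nQuery: {shot['query_terms']}")
    parts.append(f"Question: {question}\nQuery:")
    return "\n\n".join(parts)

# One made-up human-annotated example used as a shot.
shots = [{"question": "Which country hosted the 1998 FIFA World Cup?",
          "query_terms": "1998 FIFA World Cup"}]
prompt = build_prompt(shots, "Who painted the Mona Lisa?")
print(prompt)
```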
## Dataset Card Authors
Mohit Raghavendra
## Dataset Card Contact
[https://www.linkedin.com/in/mohit-r/](https://www.linkedin.com/in/mohit-r/)
|
open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4 | ---
pretty_name: Evaluation run of beaugogh/Llama2-7b-sharegpt4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beaugogh/Llama2-7b-sharegpt4](https://huggingface.co/beaugogh/Llama2-7b-sharegpt4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T12:02:42.509386](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4/blob/main/results_2023-10-13T12-02-42.509386.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.0004058451132417743,\n \"f1\": 0.06141988255033573,\n\
\ \"f1_stderr\": 0.0014263478827371335,\n \"acc\": 0.369226585159047,\n\
\ \"acc_stderr\": 0.008577465355756637\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417743,\n\
\ \"f1\": 0.06141988255033573,\n \"f1_stderr\": 0.0014263478827371335\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \
\ \"acc_stderr\": 0.00442704598726516\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248115\n\
\ }\n}\n```"
repo_url: https://huggingface.co/beaugogh/Llama2-7b-sharegpt4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T12_02_42.509386
path:
- '**/details_harness|drop|3_2023-10-13T12-02-42.509386.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T12-02-42.509386.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T12_02_42.509386
path:
- '**/details_harness|gsm8k|5_2023-10-13T12-02-42.509386.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T12-02-42.509386.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:50:59.260675.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T11:50:59.260675.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T12_02_42.509386
path:
- '**/details_harness|winogrande|5_2023-10-13T12-02-42.509386.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T12-02-42.509386.parquet'
- config_name: results
data_files:
- split: 2023_08_09T11_50_59.260675
path:
- results_2023-08-09T11:50:59.260675.parquet
- split: 2023_10_13T12_02_42.509386
path:
- results_2023-10-13T12-02-42.509386.parquet
- split: latest
path:
- results_2023-10-13T12-02-42.509386.parquet
---
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-sharegpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beaugogh/Llama2-7b-sharegpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-sharegpt4](https://huggingface.co/beaugogh/Llama2-7b-sharegpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
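Judging from the configuration above, each run's split name is simply the run timestamp with `-` and `:` replaced by `_` (an observation from this card's config, not an official API guarantee):

```python
# Run timestamp as it appears in filenames and the results JSON.
run_timestamp = "2023-10-13T12:02:42.509386"

# Split names appear to be the timestamp with "-" and ":" replaced by "_".
split_name = run_timestamp.replace("-", "_").replace(":", "_")

assert split_name == "2023_10_13T12_02_42.509386"
```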
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-13T12:02:42.509386](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-sharegpt4/blob/main/results_2023-10-13T12-02-42.509386.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417743,
"f1": 0.06141988255033573,
"f1_stderr": 0.0014263478827371335,
"acc": 0.369226585159047,
"acc_stderr": 0.008577465355756637
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417743,
"f1": 0.06141988255033573,
"f1_stderr": 0.0014263478827371335
},
"harness|gsm8k|5": {
"acc": 0.026535253980288095,
"acc_stderr": 0.00442704598726516
},
"harness|winogrande|5": {
"acc": 0.7119179163378059,
"acc_stderr": 0.012727884724248115
}
}
```
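As a sanity check, the `"all"` accuracy can be recovered from the per-task values above, assuming the aggregate is a simple mean over the tasks that report each metric (here, the two accuracy-reporting tasks):

```python
# Values copied from the "latest results" JSON above.
gsm8k_acc = 0.026535253980288095
winogrande_acc = 0.7119179163378059

# Assuming "all" averages each metric over the tasks reporting it.
all_acc = (gsm8k_acc + winogrande_acc) / 2

assert abs(all_acc - 0.369226585159047) < 1e-9
```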
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
EmbeddingStudio/synthetic-search-filters-ru | ---
license: apache-2.0
dataset_info:
features:
- name: category
dtype: string
- name: category_description
dtype: string
- name: filter_name
dtype: string
- name: representation_name
dtype: string
- name: representation_type
dtype: string
- name: representation_enum
sequence: string
- name: representation_examples
sequence: string
- name: representation_pattern
dtype: string
splits:
- name: train_filters
num_bytes: 859114
num_examples: 2749
- name: test_filters
num_bytes: 1039548
num_examples: 3317
download_size: 314660
dataset_size: 1898662
configs:
- config_name: default
data_files:
- split: train_filters
path: data/train_filters-*
- split: test_filters
path: data/test_filters-*
task_categories:
- token-classification
- text-generation
language:
- ru
- en
pretty_name: 'Synthetic Search Filters : Russian'
size_categories:
- 1K<n<10K
tags:
- synthetic
- search-queries
- e-commerce
- online-shops
- travel-agencies
- educational-institutions-ai
- job-recruitment-automation
- banking-digital-services
- investment-ai-analysis
- insurance-tech-innovation
- financial-advisory-ai
- credit-services-automation
- payment-processing-tech
- mortgage-tech-solutions
- real-estate-digital-solutions
- taxation-tech-services
- risk-management-ai
- compliance-automation
- digital-banking-innovation
- mobile-banking-tech
- online-retail-tech
- offline-retail-automation
- automotive-dealership-tech
- restaurant-automation-tech
- food-delivery-ai
- entertainment-platforms-ai
- media-platforms-tech
- government-services-automation
- travel-tech-innovation
- consumer-analytics-ai
- logistics-tech-automation
- supply-chain-ai
- customer-support-tech
- market-research-ai
- mobile-app-dev-tech
- game-dev-ai
- cloud-computing-services
- data-analytics-ai
- business-intelligence-ai
- cybersecurity-software-tech
- ui-ux-design-ai
- iot-development-tech
- project-management-tools-ai
- version-control-systems-tech
- ci-cd-automation
- issue-tracking-ai
- bug-reporting-automation
- collaborative-dev-environments
- team-communication-tech
- task-time-management-ai
- customer-feedback-ai
- cloud-based-dev-tech
- image-stock-platforms-ai
- video-hosting-tech
- social-networks-ai
- professional-social-networks-ai
- dating-apps-tech
---
# Synthetic Search Filters
These are possible search filters and their representations, generated with GPT-4 Turbo for the given business/service categories and for the Russian language domain:
```
Artificial Intelligence and Machine Learning, Automotive, Automotive Dealerships, Banking Services, Books and Media, Cloud Computing Services, Cloud-based Development Environments, Collaborative Development Environments, Commercial Real Estate, Continuous Integration/Continuous Deployment, Credit Services, Customer Support Services, Customer Support and Feedback, Cybersecurity Software, Data Analytics and Business Intelligence, Dating Apps, Digital and Mobile Banking, Documentation and Knowledge Sharing, E-commerce Platforms, Eco-Friendly and Sustainable Properties, Educational Institutions, Electronics, Enterprise Software Development, Entertainment and Media Platforms, Event Planning Services, Fashion and Apparel, Financial Planning and Advisory, Food and Grocery, Game Development, Government Services, Health and Beauty, Healthcare Providers, Home and Garden, Image Stock Platforms, Insurance Services, International Real Estate, Internet of Things (IoT) Development, Investment Services, Issue Tracking and Bug Reporting, Job Recruitment Agencies, Land Sales and Acquisitions, Legal Services, Logistics and Supply Chain Management, Luxury and High-End Properties, Market Research Firms, Mobile App Development, Mortgage and Real Estate Services, Payment Processing, Pet Supplies, Professional Social Networks, Project Management Tools, Property Management, Real Estate Consulting, Real Estate Development, Real Estate Investment, Residential Real Estate, Restaurants and Food Delivery Services, Retail Stores (Online and Offline), Risk Management and Compliance, Social Networks, Sports and Outdoors, Task and Time Management, Taxation Services, Team Communication and Chat Tools, Telecommunication Companies, Toys and Games, Travel and Booking Agencies, Travelers and Consumers, User Interface/User Experience Design, Version Control Systems, Video Hosting and Portals, Web Development
```
This is a parsed version of [`EmbeddingStudio/synthetic-search-filters-ru-raw`](https://huggingface.co/datasets/EmbeddingStudio/synthetic-search-filters-ru-raw), in which each row is a unique filter/representation pair.
## Columns description
* category (type: Optional[str]) - business/service category name.
* category_description (type: Optional[str]) - longer description of the business/service.
* filter_name (type: Optional[str]) - meaningful name of the filter.
* representation_name (type: Optional[str]) - name of the filter representation.
* representation_type (type: Optional[str]) - Python-like type of the representation value (str, int, float, bool).
* representation_enum (type: Optional[List[str]]) - if the representation is an enumeration, this is the list of possible values.
* representation_examples (type: List[Union[str, int, float]]) - examples of expected representation values.
* representation_pattern (type: Optional[str]) - if the representation is pattern-like (e.g. `dd/mm/YYYY`), this is the pattern to follow.
## What are representations?
It's easier to understand with an example. Imagine you have a filter named `Rating`; it can be represented as:
* Integer or float value in 1-5 scale
* Integer or float value in 1-10 scale
* Integer or float value in 1-100 scale
* As an enumeration with the values (*, **, ***, ****, *****)
* As an enumeration with the values (bad, medium, good, the best)
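The `Rating` example above can be written out as a single row of this dataset. A minimal sketch (field names follow the columns described earlier; the category and values are purely illustrative, not taken from the dataset):

```python
# A hypothetical row for a "Rating" filter represented as an enumeration.
rating_row = {
    "category": "E-commerce Platforms",          # illustrative
    "category_description": "Online marketplaces selling goods.",
    "filter_name": "Rating",
    "representation_name": "Rating as stars",
    "representation_type": "str",
    "representation_enum": ["*", "**", "***", "****", "*****"],
    "representation_examples": ["***", "*****"],
    "representation_pattern": None,              # not a pattern-like representation
}

# An enum-style representation lists every allowed value explicitly,
# so each example must be one of the enumerated values.
assert all(ex in rating_row["representation_enum"]
           for ex in rating_row["representation_examples"])
```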
## Train / test splitting principles
As we are trying to fine-tune an LLM to follow zero-shot query parsing instructions, we want to test its:
* Ability to work well with unseen domain
* Ability to work well with unseen filters
* Ability to work well with unseen queries
For these purposes:
1. We put 5 categories into the test split, completely separated from train: Automotive, Educational Institutions, Enterprise Software Development, Payment Processing, Professional Social Networks.
2. From each category that appears in train, we also put aside / removed one filter and the queries related to it.
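The first principle amounts to the held-out categories never appearing among the train categories. A toy sketch of that check (the train-side names here are illustrative stand-ins, not the dataset's full category list):

```python
# Test-only categories listed above.
test_only = {
    "Automotive",
    "Educational Institutions",
    "Enterprise Software Development",
    "Payment Processing",
    "Professional Social Networks",
}

# Illustrative stand-in for categories observed in the train split.
train_categories = {"Banking Services", "Dating Apps", "Game Development"}

# Principle 1: the held-out categories never appear in train.
assert test_only.isdisjoint(train_categories)
```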
# How to use it
```python
from datasets import load_dataset
filters_dataset = load_dataset("EmbeddingStudio/synthetic-search-filters-ru")
```
The Embedding Studio team uses these filters to [generate queries and their parsed versions](EmbeddingStudio/query-parsing-instructions-saiga) for the [fine-tuning of IlyaGusev/saiga_mistral_7b_lora](https://huggingface.co/IlyaGusev/saiga_mistral_7b_lora) [to follow zero-shot search query parsing instructions](https://huggingface.co/EmbeddingStudio/query-parser-saiga-mistral-7b-lora).
alokps/hf-github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 12862975
num_examples: 4000
download_size: 3165624
dataset_size: 12862975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hf-github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
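The schema above includes an `is_pull_request` flag because GitHub's issues endpoint returns pull requests alongside true issues. A minimal sketch of separating the two (toy records mimicking the schema, not fetched from the dataset):

```python
# Toy records with two fields from the schema above (illustrative values).
records = [
    {"number": 1, "title": "Fix typo in docs", "is_pull_request": True},
    {"number": 2, "title": "load_dataset crashes on Windows", "is_pull_request": False},
]

# Keep only true issues, dropping pull requests.
issues_only = [r for r in records if not r["is_pull_request"]]

assert [r["number"] for r in issues_only] == [2]
```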