datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Ransaka/Sinhala-400M | ---
dataset_info:
features:
- name: text
sequence: string
splits:
- name: train
num_bytes: 2802808058.089643
num_examples: 8854185
- name: test
num_bytes: 1201203543.9103568
num_examples: 3794651
download_size: 1826451430
dataset_size: 4004011602
license: apache-2.0
task_categories:
- text-generation
- feature-extraction
language:
- si
pretty_name: Sinhala Large Scale Corpus
size_categories:
- 10M<n<100M
---
# Dataset Card for "Sinhala-400M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlp-brin-id/unsupervised_title-fact | ---
license: apache-2.0
task_categories:
- feature-extraction
language:
- id
size_categories:
- 10K<n<100K
--- |
tyzhu/wiki_find_passage_train10_eval10_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 22558
num_examples: 30
- name: validation
num_bytes: 6982
num_examples: 10
download_size: 25018
dataset_size: 29540
---
# Dataset Card for "wiki_find_passage_train10_eval10_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deetsadi/processed_cdi_sobel | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 19391453.0
num_examples: 200
download_size: 0
dataset_size: 19391453.0
---
# Dataset Card for "processed_cdi_sobel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Prgckwb/jiro-style-ramen | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 978393.0
num_examples: 31
download_size: 978665
dataset_size: 978393.0
---
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_cola_been_done | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4170
num_examples: 45
- name: test
num_bytes: 5169
num_examples: 60
- name: train
num_bytes: 51879
num_examples: 627
download_size: 33431
dataset_size: 61218
---
# Dataset Card for "VALUE_cola_been_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thefluxapp/dsum | ---
dataset_info:
features:
- name: dialogue
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 55615012.0
num_examples: 54383
download_size: 32278282
dataset_size: 55615012.0
---
# Dataset Card for "dsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/cochrane_sparse_oracle | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-MS^2
- extended|other-Cochrane
task_categories:
- summarization
- text2text-generation
paperswithcode_id: multi-document-summarization
pretty_name: MSLR Shared Task
---
This is a copy of the [Cochrane](https://huggingface.co/datasets/allenai/mslr2022) dataset, except that the input source documents of its `validation` split have been replaced by documents retrieved with a __sparse__ retriever. The retrieval pipeline used:
- __query__: The `target` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits. A document is the concatenation of the `title` and `abstract`.
- __retriever__: BM25 via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"oracle"`, i.e. the number of documents retrieved, `k`, is set as the original number of input documents for each example
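The `"oracle"` top-k strategy above amounts to truncating the BM25 ranking at the example's original input-document count; a minimal sketch (toy names, not the actual PyTerrier pipeline):

```python
def oracle_top_k(ranked_doc_ids, original_input_docs):
    """Keep the top k retrieved documents, where k is the number of
    input documents the example originally had."""
    k = len(original_input_docs)
    return ranked_doc_ids[:k]

# Toy example: BM25 returned 5 candidates; the original review had 3 inputs.
ranking = ["doc_7", "doc_2", "doc_9", "doc_1", "doc_4"]
original = ["study_a", "study_b", "study_c"]
print(oracle_top_k(ranking, original))  # ['doc_7', 'doc_2', 'doc_9']
```

This also explains why `Rprec`, `Precision@k`, and `Recall@k` coincide in the result tables: with `k` set equal to the number of relevant documents, all three metrics reduce to the same ratio.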
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.7014 | 0.3841 | 0.3841 | 0.3841 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.7226 | 0.4023 | 0.4023 | 0.4023 |
Retrieval results on the `test` set:
N/A. The test set is blind, so we do not have any queries. |
CyberHarem/clara_starrail | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of clara/クラーラ/克拉拉/클라라 (Honkai: Star Rail)
This is the dataset of clara/クラーラ/克拉拉/클라라 (Honkai: Star Rail), containing 177 images and their tags.
The core tags of this character are `long_hair, bangs, white_hair, red_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 177 | 294.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 177 | 149.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 443 | 326.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 177 | 251.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 443 | 482.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/clara_starrail',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, long_sleeves, solo, upper_body, white_background, blush, closed_mouth, red_jacket, hair_intakes, sweater, coat |
| 1 | 8 |  |  |  |  |  | 1girl, barefoot, blush, long_sleeves, looking_at_viewer, solo, toes, coat, soles, bare_legs, simple_background, sitting, thigh_strap, white_background, full_body, pink_eyes, red_jacket, very_long_hair, closed_mouth, foot_focus, foreshortening, underwear |
| 2 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, blush, closed_mouth, coat, simple_background, smile, solo, white_background, barefoot, pink_eyes, white_dress, full_body |
| 3 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, closed_mouth, collarbone, medium_breasts, navel, pink_eyes, nipples, pussy, stomach, thighs, arms_behind_back, completely_nude, indoors, mosaic_censoring, cowboy_shot, on_back, pillow, plant, purple_eyes, small_breasts, standing, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | simple_background | long_sleeves | solo | upper_body | white_background | blush | closed_mouth | red_jacket | hair_intakes | sweater | coat | barefoot | toes | soles | bare_legs | sitting | thigh_strap | full_body | pink_eyes | very_long_hair | foot_focus | foreshortening | underwear | smile | white_dress | collarbone | medium_breasts | navel | nipples | pussy | stomach | thighs | arms_behind_back | completely_nude | indoors | mosaic_censoring | cowboy_shot | on_back | pillow | plant | purple_eyes | small_breasts | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:---------------|:-------|:-------------|:-------------------|:--------|:---------------|:-------------|:---------------|:----------|:-------|:-----------|:-------|:--------|:------------|:----------|:--------------|:------------|:------------|:-----------------|:-------------|:-----------------|:------------|:--------|:--------------|:-------------|:-----------------|:--------|:----------|:--------|:----------|:---------|:-------------------|:------------------|:----------|:-------------------|:--------------|:----------|:---------|:--------|:--------------|:----------------|:-----------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | | X | X | | | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | | X | | | X | X | | | | | | | | | | | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_wnli_adj_postfix | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4467
num_examples: 21
- name: test
num_bytes: 25683
num_examples: 91
- name: train
num_bytes: 37127
num_examples: 173
download_size: 30769
dataset_size: 67277
---
# Dataset Card for "MULTI_VALUE_wnli_adj_postfix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sambhavi/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FMunyoz/AMB | ---
license: cc
---
|
Livingwithmachines/hmd-erwt-training | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- machine-generated
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: Dataset Card for ERWT Heritage Made Digital Newspapers training data
size_categories:
- 100K<n<1M
source_datasets: []
tags:
- library,lam,newspapers,1800-1900
task_categories:
- fill-mask
task_ids:
- masked-language-modeling
---
# Dataset Card for ERWT Heritage Made Digital Newspapers training data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains text extracted at the page level from historic digitised newspapers from the [Heritage Made Digital](https://bl.iro.bl.uk/collections/9a6a4cdd-2bfe-47bb-8c14-c0a5d100501f?locale=en) newspaper digitisation program. The newspapers in the dataset were published between 1800 and 1870.
The data was primarily created as a dataset for training 'time-aware' language models.
The dataset contains text generated from Optical Character Recognition software on digitised newspaper pages. This dataset includes the plain text from the OCR alongside some minimal metadata associated with the newspaper from which the text is derived and OCR confidence score information generated from the OCR software.
#### Breakdown of word counts over time
Whilst the dataset covers the period 1800 to 1870, the number of words is not distributed evenly across time. The figures below give a sense of the breakdown of word counts over the decades.
| year | total word_count | unique words |
|-------:|-------------------:|---------------:|
| 1800 | 282,554,255 | 15,506,515 |
| 1810 | 328,817,174 | 18,295,974 |
| 1820 | 328,817,174 | 18,295,974 |
| 1830 | 194,958,624 | 10,816,938 |
| 1840 | 305,545,086 | 17,018,560 |
| 1850 | 376,194,785 | 20,942,876 |
| 1860 | 305,545,086 | 17,018,560 |
| 1870 | 51,241,037 | 2,284,803 |

### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases

[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
Sleoruiz/discursos-septima-class-separated-by-idx | ---
dataset_info:
features:
- name: text
dtype: string
- name: name
dtype: string
- name: comision
dtype: string
- name: gaceta_numero
dtype: string
- name: fecha_gaceta
dtype: string
- name: labels
sequence: string
- name: scores
sequence: float64
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 22572969
num_examples: 15070
download_size: 10450492
dataset_size: 22572969
---
# Dataset Card for "discursos-septima-class-separated-by-idx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chirunder/GRE_all_text_word_freq | ---
dataset_info:
features:
- name: word
dtype: string
- name: frequency
dtype: int64
splits:
- name: train
num_bytes: 392007
num_examples: 19836
download_size: 224362
dataset_size: 392007
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GRE_all_text_word_freq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-virology-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 44531
num_examples: 166
download_size: 31956
dataset_size: 44531
---
# Dataset Card for "mmlu-virology-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
whooray/ko_Ultrafeedback_binarized | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: chosen_response
dtype: string
- name: rejected_response
dtype: string
splits:
- name: train
num_bytes: 226278590
num_examples: 61966
download_size: 110043082
dataset_size: 226278590
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Fork of https://huggingface.co/datasets/maywell/ko_Ultrafeedback_binarized.
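This fork renames the source columns so axolotl can consume them; a minimal sketch of the mapping on plain dictionaries (the target names `prompt`/`chosen`/`rejected` are our assumption — in practice you would call `Dataset.rename_column` and match your axolotl config):

```python
def rename_for_axolotl(rows):
    """Map this dataset's column names to hypothetical axolotl DPO names."""
    mapping = {
        "instruction": "prompt",
        "chosen_response": "chosen",
        "rejected_response": "rejected",
    }
    return [{mapping.get(key, key): value for key, value in row.items()}
            for row in rows]

rows = [{"instruction": "질문", "chosen_response": "좋은 답변",
         "rejected_response": "나쁜 답변"}]
print(sorted(rename_for_axolotl(rows)[0]))  # ['chosen', 'prompt', 'rejected']
```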
We just changed the column names for use with axolotl. All credit goes to maywell. |
mnoukhov/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_1706381144 | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
- name: query_reference_response
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_response_label
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: has_comparison
dtype: bool
splits:
- name: train
num_bytes: 2125703840
num_examples: 116722
- name: validation
num_bytes: 117438077
num_examples: 6447
- name: test
num_bytes: 119411786
num_examples: 6553
download_size: 561795675
dataset_size: 2362553703
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
codeparrot/codecomplex | ---
annotations_creators: []
language_creators:
- expert-generated
language:
- code
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: CodeComplex
---
# CodeComplex Dataset
## Dataset Description
[CodeComplex](https://github.com/yonsei-toc/CodeComple) consists of 4,200 Java programs submitted to programming competitions by human programmers, together with complexity labels annotated by a group of algorithm experts.
### How to use it
You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("codeparrot/codecomplex", split="train")
print(next(iter(ds)))
```
## Data Structure
```
DatasetDict({
train: Dataset({
features: ['src', 'complexity', 'problem', 'from'],
num_rows: 4517
})
})
```
### Data Instances
```python
{'src': 'import java.io.*;\nimport java.math.BigInteger;\nimport java.util.InputMismatchException;...',
'complexity': 'quadratic',
'problem': '1179_B. Tolik and His Uncle',
'from': 'CODEFORCES'}
```
### Data Fields
* src: a string feature, representing the source code in Java.
* complexity: a string feature, giving program complexity.
* problem: a string of the feature, representing the problem name.
* from: a string feature, representing the source of the problem.
complexity filed has 7 classes, where each class has around 500 codes each. The seven classes are constant, linear, quadratic, cubic, log(n), nlog(n) and NP-hard.
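The rough class balance can be checked with a quick tally once the split is loaded; a sketch, where the toy rows stand in for examples from `load_dataset("codeparrot/codecomplex")`:

```python
from collections import Counter

def class_balance(examples):
    """Count how many programs fall into each complexity class."""
    return Counter(example["complexity"] for example in examples)

# Toy stand-in for rows loaded with load_dataset("codeparrot/codecomplex")
toy_rows = [
    {"complexity": "linear"},
    {"complexity": "quadratic"},
    {"complexity": "linear"},
]
print(class_balance(toy_rows))  # Counter({'linear': 2, 'quadratic': 1})
```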
### Data Splits
The dataset only contains a train split.
## Dataset Creation
The authors first collected problems and their solution codes in Java from CodeForces, and experienced human annotators labelled each solution with its time complexity. After the labelling, a different group of programming experts verified the class that the annotators had assigned to each example.
## Citation Information
```
@article{JeonBHHK22,
author = {Mingi Jeon and Seung-Yeop Baik and Joonghyuk Hahn and Yo-Sub Han and Sang-Ki Ko},
title = {{Deep Learning-based Code Complexity Prediction}},
year = {2022},
}
``` |
InceptiveDev/CoverLetterProV1dataset | ---
license: mit
---
|
mikhail-panzo/processed_malay_dataset_micro | ---
dataset_info:
features:
- name: speaker_embeddings
sequence: float32
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
splits:
- name: train
num_bytes: 352177204
num_examples: 3000
download_size: 350656621
dataset_size: 352177204
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
petr7555/street2shop | ---
dataset_info:
features:
- name: type
dtype: string
- name: category
dtype: string
- name: street_photo_id
dtype: int32
- name: product_id
dtype: int32
- name: width
dtype: float32
- name: top
dtype: float32
- name: height
dtype: float32
- name: left
dtype: float32
- name: shop_photo_id
dtype: int32
- name: street_photo_url
dtype: string
- name: shop_photo_url
dtype: string
- name: street_photo_image
dtype: image
- name: shop_photo_image
dtype: image
splits:
- name: test
num_bytes: 20990773602.627
num_examples: 27357
- name: train
num_bytes: 82180129067.717
num_examples: 97437
download_size: 43403838962
dataset_size: 103170902670.344
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_0.3_seed_1 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43597508
num_examples: 18929
- name: epoch_1
num_bytes: 44140361
num_examples: 18929
- name: epoch_2
num_bytes: 44219218
num_examples: 18929
- name: epoch_3
num_bytes: 44272701
num_examples: 18929
- name: epoch_4
num_bytes: 44303217
num_examples: 18929
- name: epoch_5
num_bytes: 44318370
num_examples: 18929
- name: epoch_6
num_bytes: 44329713
num_examples: 18929
- name: epoch_7
num_bytes: 44334298
num_examples: 18929
- name: epoch_8
num_bytes: 44338166
num_examples: 18929
- name: epoch_9
num_bytes: 44339871
num_examples: 18929
- name: epoch_10
num_bytes: 44340020
num_examples: 18929
- name: epoch_11
num_bytes: 44340799
num_examples: 18929
- name: epoch_12
num_bytes: 44342396
num_examples: 18929
- name: epoch_13
num_bytes: 44343629
num_examples: 18929
- name: epoch_14
num_bytes: 44343512
num_examples: 18929
- name: epoch_15
num_bytes: 44343176
num_examples: 18929
- name: epoch_16
num_bytes: 44342483
num_examples: 18929
- name: epoch_17
num_bytes: 44344000
num_examples: 18929
- name: epoch_18
num_bytes: 44342859
num_examples: 18929
- name: epoch_19
num_bytes: 44343164
num_examples: 18929
- name: epoch_20
num_bytes: 44343829
num_examples: 18929
- name: epoch_21
num_bytes: 44344365
num_examples: 18929
- name: epoch_22
num_bytes: 44344011
num_examples: 18929
- name: epoch_23
num_bytes: 44346128
num_examples: 18929
- name: epoch_24
num_bytes: 44344476
num_examples: 18929
- name: epoch_25
num_bytes: 44344911
num_examples: 18929
- name: epoch_26
num_bytes: 44345157
num_examples: 18929
- name: epoch_27
num_bytes: 44345020
num_examples: 18929
- name: epoch_28
num_bytes: 44344510
num_examples: 18929
- name: epoch_29
num_bytes: 44344390
num_examples: 18929
download_size: 699957730
dataset_size: 1329066258
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
kenhktsui/wiki_dpr_e5 | ---
license: cc-by-sa-3.0
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 78346298059.0
num_examples: 21015300
download_size: 3792584904
dataset_size: 78346298059.0
---
`wiki_dpr` encoded with `intfloat/e5-base-v2` |
nutorbit/news-headline-gen | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: headline
dtype: string
- name: news
dtype: string
splits:
- name: train
num_bytes: 23555772
num_examples: 21157
- name: dev
num_bytes: 2628111
num_examples: 2365
download_size: 17404158
dataset_size: 26183883
---
# Dataset Card for "news-headline-gen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jyshbgde/cinescopeDataset | ---
license: openrail
task_categories:
- feature-extraction
language:
- en
pretty_name: cinescope
---
|
alansun25/cs375_cv11_mandarin_test | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 761405859
num_examples: 1000
download_size: 566068077
dataset_size: 761405859
---
# Dataset Card for "cs375_cv11_mandarin_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChirathD/dpt-testing-version-1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3135193.0
num_examples: 5
download_size: 3136751
dataset_size: 3135193.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dpt-testing-version-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nvidia/OpenMath-GSM8K-masked | ---
license: other
license_name: nvidia-license
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- math
- nvidia
pretty_name: OpenMath GSM8K Masked
size_categories:
- 1K<n<10K
---
# OpenMath GSM8K Masked
We release a *masked* version of the [GSM8K](https://github.com/openai/grade-school-math) solutions.
This data can be used to aid synthetic generation of additional solutions for the GSM8K dataset,
as it is much less likely to lead to inconsistent reasoning than using
the original solutions directly.
This dataset was used to construct [OpenMathInstruct-1](https://huggingface.co/datasets/nvidia/OpenMathInstruct-1):
a math instruction tuning dataset with 1.8M problem-solution pairs
generated using the permissively licensed [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) model.
For details of how the masked solutions were created, see our [paper](https://arxiv.org/abs/2402.10176).
You can re-create this dataset or apply similar techniques to mask solutions for other datasets
by using our [open-sourced code](https://github.com/Kipok/NeMo-Skills).
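As a rough, hypothetical illustration of the masking idea (the actual procedure is implemented in the NeMo-Skills repository linked above and differs in detail), intermediate numeric values in a solution can be replaced with symbolic placeholders, keeping the reasoning structure while hiding the concrete results:

```python
import re

def mask_solution(solution: str) -> str:
    """Replace each distinct number in a solution with a symbolic
    placeholder (N0, N1, ...), reusing the same placeholder when the
    same number reappears. Purely illustrative, not the paper's method."""
    mapping = {}

    def repl(match):
        num = match.group(0)
        if num not in mapping:
            mapping[num] = f"N{len(mapping)}"
        return mapping[num]

    # Match integers and simple decimals.
    return re.sub(r"\d+(?:\.\d+)?", repl, solution)

masked = mask_solution(
    "Weng earns 12/60 = 0.2 per minute, so for 50 minutes she earned 0.2 * 50 = 10."
)
print(masked)
# -> Weng earns N0/N1 = N2 per minute, so for N3 minutes she earned N2 * N3 = N4.
```

Repeated values (0.2 and 50 above) map to the same placeholder, which is what lets a model re-derive a consistent chain of computations for new numbers.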
## Citation
If you find our work useful, please consider citing us!
```bibtex
@article{toshniwal2024openmath,
title = {OpenMathInstruct-1: A 1.8 Million Math Instruction Tuning Dataset},
author = {Shubham Toshniwal and Ivan Moshkov and Sean Narenthiran and Daria Gitman and Fei Jia and Igor Gitman},
year = {2024},
  journal = {arXiv preprint arXiv:2402.10176}
}
```
## License
The use of this dataset is governed by the [NVIDIA License](LICENSE) which permits commercial usage.
|
jarrydmartinx/recount3-RNA-seq | ---
license: gpl
---
|
open-llm-leaderboard/details_jisukim8873__mistralai-case-0-1 | ---
pretty_name: Evaluation run of jisukim8873/mistralai-case-0-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jisukim8873/mistralai-case-0-1](https://huggingface.co/jisukim8873/mistralai-case-0-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__mistralai-case-0-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T05:14:39.214242](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistralai-case-0-1/blob/main/results_2024-03-22T05-14-39.214242.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6245526964592384,\n\
\ \"acc_stderr\": 0.032657544114946147,\n \"acc_norm\": 0.6303693247434217,\n\
\ \"acc_norm_stderr\": 0.033326929053029704,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.41430745876476355,\n\
\ \"mc2_stderr\": 0.014409498973913385\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6313483369846644,\n\
\ \"acc_stderr\": 0.004814532642574651,\n \"acc_norm\": 0.8305118502290381,\n\
\ \"acc_norm_stderr\": 0.003744157442536556\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072388,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072388\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045804,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035293,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035293\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879713,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.014987325439963554,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.014987325439963554\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.41430745876476355,\n\
\ \"mc2_stderr\": 0.014409498973913385\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3464746019711903,\n \
\ \"acc_stderr\": 0.01310717905431338\n }\n}\n```"
repo_url: https://huggingface.co/jisukim8873/mistralai-case-0-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|arc:challenge|25_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|gsm8k|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hellaswag|10_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-14-39.214242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T05-14-39.214242.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- '**/details_harness|winogrande|5_2024-03-22T05-14-39.214242.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T05-14-39.214242.parquet'
- config_name: results
data_files:
- split: 2024_03_22T05_14_39.214242
path:
- results_2024-03-22T05-14-39.214242.parquet
- split: latest
path:
- results_2024-03-22T05-14-39.214242.parquet
---
# Dataset Card for Evaluation run of jisukim8873/mistralai-case-0-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jisukim8873/mistralai-case-0-1](https://huggingface.co/jisukim8873/mistralai-case-0-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jisukim8873__mistralai-case-0-1",
"harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2024-03-22T05:14:39.214242](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistralai-case-0-1/blob/main/results_2024-03-22T05-14-39.214242.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6245526964592384,
"acc_stderr": 0.032657544114946147,
"acc_norm": 0.6303693247434217,
"acc_norm_stderr": 0.033326929053029704,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.41430745876476355,
"mc2_stderr": 0.014409498973913385
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938213
},
"harness|hellaswag|10": {
"acc": 0.6313483369846644,
"acc_stderr": 0.004814532642574651,
"acc_norm": 0.8305118502290381,
"acc_norm_stderr": 0.003744157442536556
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072388,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072388
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045804,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035293,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035293
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879713,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963554,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.41430745876476355,
"mc2_stderr": 0.014409498973913385
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.3464746019711903,
"acc_stderr": 0.01310717905431338
}
}
```
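Once a results file like the one above has been loaded as a plain dictionary, the per-task scores are easy to post-process. The sketch below (not part of the card's tooling; the sample dict is a small hypothetical excerpt of the structure shown above, not the full file) computes a macro-average accuracy over the MMLU ("hendrycksTest") tasks:

```python
# Hypothetical excerpt of a results dict shaped like the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|winogrande|5": {"acc": 0.7884767166535123},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
macro_avg = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, macro-average acc = {macro_avg:.4f}")
```

The same pattern extends to `acc_norm`, `mc1`/`mc2`, and the other metrics reported per task.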
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_mnli_double_superlative | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 73217
num_examples: 290
- name: dev_mismatched
num_bytes: 69152
num_examples: 277
- name: test_matched
num_bytes: 89060
num_examples: 350
- name: test_mismatched
num_bytes: 69882
num_examples: 282
- name: train
num_bytes: 3225807
num_examples: 12917
download_size: 2134967
dataset_size: 3527118
---
# Dataset Card for "MULTI_VALUE_mnli_double_superlative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rmanluo/RoG-webqsp | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
sequence: string
- name: q_entity
sequence: string
- name: a_entity
sequence: string
- name: graph
sequence:
sequence: string
- name: choices
sequence: 'null'
splits:
- name: train
num_bytes: 993540472
num_examples: 2826
- name: validation
num_bytes: 84009553
num_examples: 246
- name: test
num_bytes: 580788090
num_examples: 1628
download_size: 0
dataset_size: 1658338115
---
# Dataset Card for "RoG-webqsp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atom92/medical_healthwa | ---
license: cc
---
|
cannlytics/cannabis_licenses | ---
pretty_name: cannabis_licenses
annotations_creators:
- expert-generated
language_creators:
- expert-generated
license:
- cc-by-4.0
tags:
- cannabis
- licenses
---
# Cannabis Licenses
<!-- FIXME:
<div align="center" style="text-align:center; margin-top:1rem; margin-bottom: 1rem;">
<img style="max-height:365px;width:100%;max-width:720px;" alt="" src="analysis/figures/cannabis-licenses-map.png">
</div> -->
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Data Collection and Normalization](#data-collection-and-normalization)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [License](#license)
- [Citation](#citation)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** <https://github.com/cannlytics/cannlytics>
- **Repository:** <https://huggingface.co/datasets/cannlytics/cannabis_licenses>
- **Point of Contact:** <dev@cannlytics.com>
### Dataset Summary
**Cannabis Licenses** is a collection of cannabis license data for each state with permitted adult-use cannabis. The dataset also includes a sub-dataset, `all`, that contains all licenses.
## Dataset Structure
The dataset is partitioned into a subset for each state, plus an aggregate `all` subset.
| State | Code | Status |
|-------|------|--------|
| [All](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/all) | `all` | ✅ |
| [Alaska](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/ak) | `ak` | ✅ |
| [Arizona](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/az) | `az` | ✅ |
| [California](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/ca) | `ca` | ✅ |
| [Colorado](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/co) | `co` | ✅ |
| [Connecticut](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/ct) | `ct` | ✅ |
| [Delaware](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/de) | `de` | ✅ |
| [Illinois](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/il) | `il` | ✅ |
| [Maine](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/me) | `me` | ✅ |
| [Maryland](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/md) | `md` | ✅ |
| [Massachusetts](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/ma) | `ma` | ✅ |
| [Michigan](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/mi) | `mi` | ✅ |
| [Missouri](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/mo) | `mo` | ✅ |
| [Montana](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/mt) | `mt` | ✅ |
| [Nevada](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/nv) | `nv` | ✅ |
| [New Jersey](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/nj) | `nj` | ✅ |
| [New Mexico](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/nm) | `nm` | ✅ |
| [New York](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/ny) | `ny` | ✅ |
| [Oregon](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/or) | `or` | ✅ |
| [Rhode Island](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/ri) | `ri` | ✅ |
| [Vermont](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/vt) | `vt` | ✅ |
| Virginia | `va` | ⏳ Expected 2024 |
| [Washington](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/wa) | `wa` | ✅ |
The following states have issued medical cannabis licenses, but are not (yet) included in the dataset:
- Alabama
- Arkansas
- District of Columbia (D.C.)
- Florida
- Kentucky (2024)
- Louisiana
- Minnesota
- Mississippi
- New Hampshire
- North Dakota
- Ohio
- Oklahoma
- Pennsylvania
- South Dakota
- Utah
- West Virginia
### Data Instances
You can load the licenses for each state. For example:
```py
from datasets import load_dataset
# Get the licenses for a specific state.
dataset = load_dataset('cannlytics/cannabis_licenses', 'ca')
data = dataset['data']
```
### Data Fields
Below is a non-exhaustive list of the standardized fields that you may expect to find for each observation.
| Field | Example | Description |
|-------|-----|-------------|
| `id` | `"1046"` | A state-unique ID for the license. |
| `license_number` | `"C10-0000423-LIC"` | A unique license number. |
| `license_status` | `"Active"` | The status of the license. Only licenses that are active are included. |
| `license_status_date` | `"2022-04-20T00:00"` | The date the status was assigned, an ISO-formatted date if present. |
| `license_term` | `"Provisional"` | The term for the license. |
| `license_type` | `"Commercial - Retailer"` | The type of business license. |
| `license_designation` | `"Adult-Use and Medicinal"` | A state-specific classification for the license. |
| `issue_date` | `"2019-07-15T00:00:00"` | An issue date for the license, an ISO-formatted date if present. |
| `expiration_date` | `"2023-07-14T00:00:00"` | An expiration date for the license, an ISO-formatted date if present. |
| `licensing_authority_id` | `"BCC"` | A unique ID for the state licensing authority. |
| `licensing_authority` | `"Bureau of Cannabis Control (BCC)"` | The state licensing authority. |
| `business_legal_name` | `"Movocan"` | The legal name of the business that owns the license. |
| `business_dba_name` | `"Movocan"` | The name the license is doing business as. |
| `business_owner_name` | `"redacted"` | The name of the owner of the license. |
| `business_structure` | `"Corporation"` | The structure of the business that owns the license. |
| `activity` | `"Pending Inspection"` | Any relevant license activity. |
| `premise_street_address` | `"1632 Gateway Rd"` | The street address of the business. |
| `premise_city` | `"Calexico"` | The city of the business. |
| `premise_state` | `"CA"` | The state abbreviation of the business. |
| `premise_county` | `"Imperial"` | The county of the business. |
| `premise_zip_code` | `"92231"` | The zip code of the business. |
| `business_email` | `"redacted@gmail.com"` | The business email of the license. |
| `business_phone` | `"(555) 555-5555"` | The business phone of the license. |
| `business_website` | `"cannlytics.com"` | The business website of the license. |
| `parcel_number` | `"A42"` | An ID for the business location. |
| `premise_latitude` | `32.69035693` | The latitude of the business. |
| `premise_longitude` | `-115.38987552` | The longitude of the business. |
| `data_refreshed_date` | `"2022-09-21T12:16:33.3866667"` | An ISO-formatted time when the license data was updated. |
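As a quick illustration of working with these fields, here is a sketch that filters records by state. The records below are hypothetical rows shaped like the table above, not live data:

```python
# Hypothetical records with a subset of the fields described above.
licenses = [
    {"license_number": "C10-0000423-LIC", "license_status": "Active", "premise_state": "CA"},
    {"license_number": "C10-0000999-LIC", "license_status": "Active", "premise_state": "CO"},
]

def by_state(records, state):
    """Select licenses whose premises are in the given state."""
    return [r for r in records if r["premise_state"] == state]

print([r["license_number"] for r in by_state(licenses, "CA")])
# ['C10-0000423-LIC']
```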
### Data Splits
The data is split into subsets by state. You can retrieve all licenses by requesting the `all` subset.
```py
from datasets import load_dataset
# Get all cannabis licenses.
dataset = load_dataset('cannlytics/cannabis_licenses', 'all')
data = dataset['data']
```
## Dataset Creation
### Curation Rationale
Data about organizations operating in the cannabis industry for each state is valuable for research.
### Source Data
| State | Data Source URL |
|-------|-----------------|
| Alaska | <https://www.commerce.alaska.gov/abc/marijuana/Home/licensesearch> |
| Arizona | <https://azcarecheck.azdhs.gov/s/?licenseType=null> |
| California | <https://search.cannabis.ca.gov/> |
| Colorado | <https://sbg.colorado.gov/med/licensed-facilities> |
| Connecticut | <https://portal.ct.gov/DCP/Medical-Marijuana-Program/Connecticut-Medical-Marijuana-Dispensary-Facilities> |
| Delaware | <https://dhss.delaware.gov/dhss/dph/hsp/medmarcc.html> |
| Illinois | <https://www.idfpr.com/LicenseLookup/AdultUseDispensaries.pdf> |
| Maine | <https://www.maine.gov/dafs/ocp/open-data/adult-use> |
| Maryland | <https://mmcc.maryland.gov/Pages/Dispensaries.aspx> |
| Massachusetts | <https://masscannabiscontrol.com/open-data/data-catalog/> |
| Michigan | <https://michigan.maps.arcgis.com/apps/webappviewer/index.html?id=cd5a1a76daaf470b823a382691c0ff60> |
| Missouri | <https://health.mo.gov/safety/cannabis/licensed-facilities.php> |
| Montana | <https://mtrevenue.gov/cannabis/#CannabisLicenses> |
| Nevada | <https://ccb.nv.gov/list-of-licensees/> |
| New Jersey | <https://data.nj.gov/stories/s/ggm4-mprw> |
| New Mexico | <https://nmrldlpi.force.com/bcd/s/public-search-license?division=CCD&language=en_US> |
| New York | <https://cannabis.ny.gov/licensing> |
| Oregon | <https://www.oregon.gov/olcc/marijuana/pages/recreational-marijuana-licensing.aspx> |
| Rhode Island | <https://dbr.ri.gov/office-cannabis-regulation/compassion-centers/licensed-compassion-centers> |
| Vermont | <https://ccb.vermont.gov/licenses> |
| Washington | <https://lcb.wa.gov/records/frequently-requested-lists> |
### Data Collection and Normalization
In the `algorithms` directory, you can find the algorithms used for data collection. You can use these algorithms to recreate the dataset. First, you will need to clone the repository:
```
git clone https://huggingface.co/datasets/cannlytics/cannabis_licenses
```
You can then install the Python (3.9+) requirements for the algorithms:
```
cd cannabis_licenses
pip install -r requirements.txt
```
Then you can run all of the data-collection algorithms:
```
python algorithms/main.py
```
Or you can run each algorithm individually. For example:
```
python algorithms/get_licenses_ny.py
```
### Personal and Sensitive Information
This dataset includes names of individuals, public addresses, and contact information for cannabis licensees. It is important to take care to use these data points in a legal manner.
## Considerations for Using the Data
### Social Impact of Dataset
Arguably, substantial social impact could result from the study of permitted adult-use cannabis; therefore, researchers and data consumers alike should take the utmost care in the use of this dataset.
### Discussion of Biases
Cannlytics is a for-profit data and analytics company that primarily serves cannabis businesses. The data are not randomly collected and thus sampling bias should be taken into consideration.
### Other Known Limitations
The data is for adult-use cannabis licenses. It would be valuable to include medical cannabis licenses too.
## Additional Information
### Dataset Curators
Curated by [🔥Cannlytics](https://cannlytics.com)<br>
<contact@cannlytics.com>
### License
```
Copyright (c) 2022-2023 Cannlytics and the Cannabis Data Science Team
The files associated with this dataset are licensed under a
Creative Commons Attribution 4.0 International license.
You can share, copy and modify this dataset so long as you give
appropriate credit, provide a link to the CC BY license, and
indicate if changes were made, but you may not do so in a way
that suggests the rights holder has endorsed you or your use of
the dataset. Note that further permission may be required for
any content within the dataset that is identified as belonging
to a third party.
```
### Citation
Please cite the following if you use the code examples in your research:
```bibtex
@misc{cannlytics2023,
title={Cannabis Data Science},
author={Skeate, Keegan and O'Sullivan-Sutherland, Candace},
journal={https://github.com/cannlytics/cannabis-data-science},
year={2023}
}
```
### Contributions
Thanks to [🔥Cannlytics](https://cannlytics.com), [@candy-o](https://github.com/candy-o), [@hcadeaux](https://huggingface.co/hcadeaux), [@keeganskeate](https://github.com/keeganskeate), and the entire [Cannabis Data Science Team](https://meetup.com/cannabis-data-science/members) for their contributions.
|
Saturo1234567/Gojo23 | ---
license: openrail
---
|
ManpreetK/NDD_NER | ---
viewer: true
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': I-CONDITION
'1': I-TEST
'2': B-CONDITION
'3': I-PATIENT_GROUP
'4': B-ASSOCIATED_PROBLEM
'5': O
'6': I-ASSOCIATED_PROBLEM
'7': B-INTERVENTION
'8': B-PATIENT_GROUP
'9': I-INTERVENTION
'10': B-TEST
splits:
- name: train
num_bytes: 156151
num_examples: 341
- name: validation
num_bytes: 68495
num_examples: 177
- name: test
num_bytes: 67949
num_examples: 160
download_size: 78315
dataset_size: 292595
---
# Dataset Card for "NDD_NER"
## Dataset Summary
This Named Entity Recognition dataset was created for the neurodevelopmental disorders domain to detect domain-specific entities. Initially, PubMed abstracts were annotated
with the SciSpaCy UMLS entity linker, and specific semantic types were mapped to the required domain-specific labels, which were further validated during a manual curation process
using Label Studio (an open-source data labeling tool).
| Label Category | UMLS semantic types |
|-----|-----|
|CONDITION| Mental or Behavioral Dysfunction, Disease or Syndrome, Neoplastic Process, Congenital Abnormality |
|ASSOCIATED_PROBLEM| Sign or Symptom, Mental Process, Injury or Poisoning |
|PATIENT_GROUP| Age Group, Population Group, Patient or Disabled Group |
|INTERVENTION| Therapeutic or Preventive Procedure, Health Care Activity |
|TEST| Diagnostic Procedure, Intellectual Product, Research Activity, Laboratory Procedure |
## Dataset Splits
|split name|number of examples|CONDITION|ASSOCIATED_PROBLEM|PATIENT_GROUP|INTERVENTION|TEST|
|-----|-----|-----|-----|-----|-----|-----|
|train| 341 | 320 | 189 | 240 | 273 | 228 |
|test| 160 | 139 | 68 | 87 | 98 | 82 |
|validation| 177 | 147 | 82 | 104 | 117 | 98 |
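The integer `ner_tags` can be decoded back into label strings using the mapping from the dataset's `class_label` metadata. A minimal sketch (the example tag sequence is hypothetical):

```python
# Id-to-label mapping, copied from the dataset's class_label metadata.
ID2LABEL = {
    0: "I-CONDITION", 1: "I-TEST", 2: "B-CONDITION", 3: "I-PATIENT_GROUP",
    4: "B-ASSOCIATED_PROBLEM", 5: "O", 6: "I-ASSOCIATED_PROBLEM",
    7: "B-INTERVENTION", 8: "B-PATIENT_GROUP", 9: "I-INTERVENTION", 10: "B-TEST",
}

def decode_tags(tag_ids):
    """Convert a sequence of integer ner_tags into their string labels."""
    return [ID2LABEL[i] for i in tag_ids]

# Hypothetical example tags for a three-token span.
print(decode_tags([8, 3, 2]))
# ['B-PATIENT_GROUP', 'I-PATIENT_GROUP', 'B-CONDITION']
```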
## Source Data
PubMed abstracts retrieved with the NCBI E-utilities API for the query ("Neurodevelopmental Disorders"[Mesh]) AND "Behavioral Disciplines and Activities"[Mesh].
|
matejklemen/vuamc | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- other
multilinguality:
- monolingual
pretty_name: VUA Metaphor Corpus
size_categories:
- 10K<n<100K
- 100K<n<1M
source_datasets: []
tags:
- metaphor-classification
- multiword-expression-detection
- vua20
- vua18
- mipvu
task_categories:
- text-classification
- token-classification
task_ids:
- multi-class-classification
---
# Dataset Card for VUA Metaphor Corpus
**Important note#1**: This is a slightly simplified but mostly complete parse of the corpus. What is missing are lemmas and some metadata that was not important at the time of writing the parser. See the section `Simplifications` for more information on this.
**Important note#2**: The dataset contains metadata - to ignore it and correctly remap the annotations, see the section `Discarding metadata`.
### Dataset Summary
VUA Metaphor Corpus (VUAMC) contains a selection of excerpts from BNC-Baby files that have been annotated for metaphor. There are four registers, each comprising about 50 000 words: academic texts, news texts, fiction, and conversations.
Words have been separately labelled as participating in multi-word expressions (about 1.5%) or as discarded for metaphor analysis (0.02%). Main categories include words that are related to metaphor (MRW), words that signal metaphor (MFlag), and words that are not related to metaphor. For metaphor-related words, subdivisions have been made between clear cases of metaphor versus borderline cases (WIDLII, When In Doubt, Leave It In). Another parameter of metaphor-related words makes a distinction between direct metaphor, indirect metaphor, and implicit metaphor.
### Supported Tasks and Leaderboards
Metaphor detection, metaphor type classification.
### Languages
English.
## Dataset Structure
### Data Instances
A sample instance from the dataset:
```
{
'document_name': 'kcv-fragment42',
'words': ['', 'I', 'think', 'we', 'should', 'have', 'different', 'holidays', '.'],
'pos_tags': ['N/A', 'PNP', 'VVB', 'PNP', 'VM0', 'VHI', 'AJ0', 'NN2', 'PUN'],
'met_type': [
{'type': 'mrw/met', 'word_indices': [5]}
],
'meta': ['vocal/laugh', 'N/A', 'N/A', 'N/A', 'N/A', 'N/A', 'N/A', 'N/A', 'N/A']
}
```
### Data Fields
The instances are ordered as they appear in the corpus.
- `document_name`: a string containing the name of the document in which the sentence appears;
- `words`: words in the sentence (`""` when the word represents metadata);
- `pos_tags`: POS tags of the words, encoded using the BNC basic tagset (`"N/A"` when the word does not have an associated POS tag);
- `met_type`: metaphors in the sentence, marked by their type and word indices;
- `meta`: selected metadata tags providing additional context to the sentence. Metadata may not correspond to a specific word. In this case, the metadata is represented with an empty string (`""`) in `words` and a `"N/A"` tag in `pos_tags`.
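The `met_type` entries can be resolved to the words they annotate. A minimal sketch using the sample instance shown earlier (pure Python; no assumptions beyond the schema on this card):

```python
# Sample instance from the card (abbreviated to the relevant fields).
sample = {
    "words": ["", "I", "think", "we", "should", "have", "different", "holidays", "."],
    "met_type": [{"type": "mrw/met", "word_indices": [5]}],
}

def resolve_metaphors(instance):
    """Map each met_type annotation to the words at its indices."""
    return [
        (met["type"], [instance["words"][i] for i in met["word_indices"]])
        for met in instance["met_type"]
    ]

print(resolve_metaphors(sample))
# [('mrw/met', ['have'])]
```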
## Dataset Creation
For detailed information on the corpus, please check out the references in the `Citation Information` section or contact the dataset authors.
## Simplifications
The raw corpus is equipped with rich metadata and encoded in the TEI XML format. The textual part is fully parsed except for the lemmas, i.e. all the sentences in the raw corpus are present in the dataset.
However, parsing the metadata fully is unnecessarily tedious, so certain simplifications were made:
- paragraph information is not preserved as the dataset is parsed at sentence level;
- manual corrections (`<corr>`) of incorrectly written words are ignored, and the original, incorrect form of the words is used instead;
- `<ptr>` and `<anchor>` tags are ignored as I cannot figure out what they represent;
- the attributes `rendition` (in `<hi>` tags) and `new` (in `<shift>` tags) are not exposed.
## Discarding metadata
The dataset contains rich metadata, which is stored in the `meta` attribute. To keep data aligned, empty words or `"N/A"`s are inserted into the other attributes. If you want to ignore the metadata and correct the metaphor type annotations, you can use code similar to the following snippet:
```python3
data = datasets.load_dataset("matejklemen/vuamc")["train"]
data = data.to_pandas()
for idx_ex in range(data.shape[0]):
curr_ex = data.iloc[idx_ex]
idx_remap = {}
for idx_word, word in enumerate(curr_ex["words"]):
if len(word) != 0:
idx_remap[idx_word] = len(idx_remap)
# Note that lists are stored as np arrays by datasets, while we are storing new data in a list!
# (unhandled for simplicity)
words, pos_tags, met_type = curr_ex[["words", "pos_tags", "met_type"]].tolist()
if len(idx_remap) != len(curr_ex["words"]):
words = list(filter(lambda _word: len(_word) > 0, curr_ex["words"]))
pos_tags = list(filter(lambda _pos: _pos != "N/A", curr_ex["pos_tags"]))
met_type = []
for met_info in curr_ex["met_type"]:
met_type.append({
"type": met_info["type"],
"word_indices": list(map(lambda _i: idx_remap[_i], met_info["word_indices"]))
})
```
## Additional Information
### Dataset Curators
Gerard Steen; et al. (please see http://hdl.handle.net/20.500.12024/2541 for the full list).
### Licensing Information
Available for non-commercial use on condition that the terms of the [BNC Licence](http://www.natcorp.ox.ac.uk/docs/licence.html) are observed and that this header is included in its entirety with any copy distributed.
### Citation Information
```
@book{steen2010method,
title={A method for linguistic metaphor identification: From MIP to MIPVU},
author={Steen, Gerard and Dorst, Lettie and Herrmann, J. and Kaal, Anna and Krennmayr, Tina and Pasma, Trijntje},
volume={14},
year={2010},
publisher={John Benjamins Publishing}
}
```
```
@inproceedings{leong-etal-2020-report,
title = "A Report on the 2020 {VUA} and {TOEFL} Metaphor Detection Shared Task",
author = "Leong, Chee Wee (Ben) and
Beigman Klebanov, Beata and
Hamill, Chris and
Stemle, Egon and
Ubale, Rutuja and
Chen, Xianyang",
booktitle = "Proceedings of the Second Workshop on Figurative Language Processing",
year = "2020",
url = "https://aclanthology.org/2020.figlang-1.3",
doi = "10.18653/v1/2020.figlang-1.3",
pages = "18--29"
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset.
|
burtenshaw/function_calling_benchmark | ---
license: apache-2.0
---
|
hyuny0219/KorQuAD | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 48815283
num_examples: 102278
download_size: 29874541
dataset_size: 48815283
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LewisShanghai/autotrain-data-books-rating-analysis | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: books-rating-analysis
## Dataset Description
This dataset has been automatically processed by AutoTrain for project books-rating-analysis.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_Unnamed: 0": 1976,
"feat_user_id": "792500e85277fa7ada535de23e7eb4c3",
"feat_book_id": 18243288,
"feat_review_id": "7f8219233a62bde2973ddd118e8162e2",
"target": 2,
"text": "This book is kind of tricky. It is pleasingly written stylistically and it's an easy read so I cruised along on the momentum of the smooth prose and the potential of what this book could have and should have been for a while before I realized that it is hollow and aimless. \n This is a book where the extraordinary is deliberately made mundane for some reason and characters are stubbornly underdeveloped. It is as if all the drama has been removed from this story, leaving a bloodless collection of 19th industrial factoids sprinkled amidst a bunch of ciphers enduring an oddly dull series of tragedies. \n Mildly entertaining for a while but ultimately unsatisfactory.",
"feat_date_added": "Mon Apr 27 11:37:36 -0700 2015",
"feat_date_updated": "Mon May 04 08:50:42 -0700 2015",
"feat_read_at": "Mon May 04 08:50:42 -0700 2015",
"feat_started_at": "Mon Apr 27 00:00:00 -0700 2015",
"feat_n_votes": 0,
"feat_n_comments": 0
},
{
"feat_Unnamed: 0": 523,
"feat_user_id": "01ec1a320ffded6b2dd47833f2c8e4fb",
"feat_book_id": 18220354,
"feat_review_id": "c19543fab6b2386df92c1a9ba3cf6e6b",
"target": 4,
"text": "4.5 stars!! I am always intrigued to read a novel written from a male POV. I am equally fascinated by pen names, and even when the writer professes to be one gender or the other (or leaves it open to the imagination such as BG Harlen), I still wonder at the back of my mind whether the author is a male or female. Do some female writers have a decidedly masculine POV? Yes, there are several that come to mind. Do some male writers have a feminine \"flavor\" to their writing? It seems so. \n And so we come to the fascinating Thou Shalt Not. I loved Luke's story, as well as JJ Rossum's writing style, and don't want to be pigeon-holed into thinking that the author is male or female. That's just me. Either way, it's a very sexy and engaging book with plenty of steamy scenes to satisfy even the most jaded erotic romance reader (such as myself). The story carries some very weighty themes (domestic violence, adultery, the nature of beauty), but the book is very fast-paced and satisfying. Will Luke keep himself out of trouble with April? Will he learn to really love someone again? No spoilers here, but the author answers these questions while exploring what qualities are really important and what makes someone worthy of love. \n This book has a very interesting conclusion that some readers will love, and some might find a little challenging. I loved it and can't wait to read more from this author. \n *ARC provided by the author in exchange for an honest review.",
"feat_date_added": "Mon Jul 29 16:04:04 -0700 2013",
"feat_date_updated": "Thu Dec 12 21:43:54 -0800 2013",
"feat_read_at": "Fri Dec 06 00:00:00 -0800 2013",
"feat_started_at": "Thu Dec 05 00:00:00 -0800 2013",
"feat_n_votes": 10,
"feat_n_comments": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_Unnamed: 0": "Value(dtype='int64', id=None)",
"feat_user_id": "Value(dtype='string', id=None)",
"feat_book_id": "Value(dtype='int64', id=None)",
"feat_review_id": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['0', '1', '2', '3', '4', '5'], id=None)",
"text": "Value(dtype='string', id=None)",
"feat_date_added": "Value(dtype='string', id=None)",
"feat_date_updated": "Value(dtype='string', id=None)",
"feat_read_at": "Value(dtype='string', id=None)",
"feat_started_at": "Value(dtype='string', id=None)",
"feat_n_votes": "Value(dtype='int64', id=None)",
"feat_n_comments": "Value(dtype='int64', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2397 |
| valid | 603 |
|
crumb/flan-ul2-tinystories-complex | ---
license: mit
language:
- en
---
Around a quarter of a million examples generated from Flan-UL2 (20b) with the prompt "Write a complex short story using the vocabulary of a third-grader." to be used in an experimental curriculum learning setting. I had to checkpoint every 1024 examples to mitigate the program slowing down due to memory usage. This was run in bf16 on an RTXA6000 with the following settings:
```
top_k = random between (40, 128)
temperature = random between (0.6, 0.95)
max_length = 128
batch_size = 32
```
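The per-batch randomization described above can be sketched as follows (a hedged reconstruction using Python's `random` module; the actual generation script and RNG calls are assumptions):

```python
import random

def sample_generation_settings(seed=None):
    """Draw per-batch sampling settings in the ranges described above."""
    rng = random.Random(seed)
    return {
        "top_k": rng.randint(40, 128),
        "temperature": rng.uniform(0.6, 0.95),
        "max_length": 128,
        "batch_size": 32,
    }

settings = sample_generation_settings(seed=0)
assert 40 <= settings["top_k"] <= 128
assert 0.6 <= settings["temperature"] <= 0.95
```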
I wanted to avoid a uniform, boring set with the same exact patterns, so I randomly modulate the temperature and top_k values to get a good mix. This cost ~$6 USD to create on RunPod. |
Ve11ichor/Song0.2kTestset | ---
license: apache-2.0
---
|
youngwoo3283/one_column_2000 | ---
size_categories:
- n<1K
--- |
liuyanchen1015/MULTI_VALUE_sst2_quotative_like | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: train
num_bytes: 426
num_examples: 3
download_size: 2429
dataset_size: 426
---
# Dataset Card for "MULTI_VALUE_sst2_quotative_like"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haroldim/treino-voz-haroldo-ia-final | ---
license: openrail++
---
|
ziozzang/deepl-trans-ES-KO | ---
task_categories:
- translation
language:
- ko
- es
---
This dataset contains Wikipedia articles automatically aggregated and translated with DeepL.
# String/Corpus pairs
From ES/Spanish to KO/Korean.
# Quality Filtering
- Stripping whole HTML tags.
- removed references and annotation mark.
- Filtered by string length.
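A minimal sketch of the kind of filtering described above (the exact patterns and length thresholds are assumptions; the original pipeline is not published):

```python
import re

def clean_and_filter(text, min_len=20, max_len=2000):
    """Strip HTML tags and reference marks, then filter by string length."""
    text = re.sub(r"<[^>]+>", "", text)   # strip whole HTML tags
    text = re.sub(r"\[\d+\]", "", text)   # remove [1]-style reference marks
    text = text.strip()
    return text if min_len <= len(text) <= max_len else None

print(clean_and_filter("<p>El sol brilla sobre la ciudad.[1]</p>"))
# El sol brilla sobre la ciudad.
```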
---
The strings/corpus are aggregated from Wikipedia (es) and translated using DeepL.
All data collected by Jioh L. Jung <ziozzang@gmail.com>
license: mit
--- |
lmqg/qa_squad | ---
license: cc-by-4.0
pretty_name: SQuAD with QG split.
language: en
multilinguality: monolingual
size_categories: 1M<
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
---
# Dataset Card for "lmqg/qa_squad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://rajpurkar.github.io/SQuAD-explorer/](https://rajpurkar.github.io/SQuAD-explorer/)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is the SQuAD v1 dataset with the train/validation/test split used in [qg_squad](https://huggingface.co/datasets/lmqg/qg_squad).
### Supported Tasks and Leaderboards
* `question-answering`
### Languages
English (en)
## Dataset Structure
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature of id
- `title`: a `string` feature of title of the paragraph
- `context`: a `string` feature of paragraph
- `question`: a `string` feature of question
- `answers`: a `json` feature of answers
### Data Splits
|train |validation|test |
|--------:|---------:|-------:|
| 75,722| 10,570| 11,877|
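The `answers` field follows the standard SQuAD format: a dict with parallel `text` and `answer_start` lists. A minimal sketch of unpacking it (the example values are hypothetical):

```python
# Hypothetical answers entry in the standard SQuAD format.
answers = {"text": ["Denver Broncos"], "answer_start": [177]}

def first_answer(answers):
    """Return the first gold answer span as (text, start_offset)."""
    return answers["text"][0], answers["answer_start"][0]

text, start = first_answer(answers)
print(text, start)
# Denver Broncos 177
```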
## Citation Information
```
@article{2016arXiv160605250R,
author = {{Rajpurkar}, Pranav and {Zhang}, Jian and {Lopyrev},
Konstantin and {Liang}, Percy},
title = "{SQuAD: 100,000+ Questions for Machine Comprehension of Text}",
journal = {arXiv e-prints},
year = 2016,
eid = {arXiv:1606.05250},
pages = {arXiv:1606.05250},
archivePrefix = {arXiv},
eprint = {1606.05250},
}
``` |
sasha/prof_images_blip__wavymulder-Analog-Diffusion | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: courier
num_bytes: 3774069.0
num_examples: 100
- name: aide
num_bytes: 3686691.0
num_examples: 100
- name: police_officer
num_bytes: 3716514.0
num_examples: 100
- name: purchasing_agent
num_bytes: 3374948.0
num_examples: 100
- name: metal_worker
num_bytes: 4585929.0
num_examples: 100
- name: financial_analyst
num_bytes: 3272085.0
num_examples: 100
- name: stocker
num_bytes: 4284000.0
num_examples: 100
- name: it_specialist
num_bytes: 3445262.0
num_examples: 100
- name: writer
num_bytes: 4338105.0
num_examples: 100
- name: accountant
num_bytes: 3273259.0
num_examples: 100
- name: coach
num_bytes: 4333295.0
num_examples: 100
- name: painter
num_bytes: 4207207.0
num_examples: 100
- name: real_estate_broker
num_bytes: 3744904.0
num_examples: 100
- name: truck_driver
num_bytes: 4744401.0
num_examples: 100
- name: data_entry_keyer
num_bytes: 4750907.0
num_examples: 100
- name: computer_support_specialist
num_bytes: 3220896.0
num_examples: 100
- name: cook
num_bytes: 3507117.0
num_examples: 100
- name: interior_designer
num_bytes: 3385993.0
num_examples: 100
- name: nutritionist
num_bytes: 4499939.0
num_examples: 100
- name: designer
num_bytes: 3262956.0
num_examples: 100
- name: maid
num_bytes: 3688106.0
num_examples: 100
- name: producer
num_bytes: 3855517.0
num_examples: 100
- name: executive_assistant
num_bytes: 2956660.0
num_examples: 100
- name: logistician
num_bytes: 3785521.0
num_examples: 100
- name: tractor_operator
num_bytes: 6024318.0
num_examples: 100
- name: doctor
num_bytes: 3241492.0
num_examples: 100
- name: inventory_clerk
num_bytes: 3888705.0
num_examples: 100
- name: sheet_metal_worker
num_bytes: 4317010.0
num_examples: 100
- name: groundskeeper
num_bytes: 5131469.0
num_examples: 100
- name: electrical_engineer
num_bytes: 4010184.0
num_examples: 100
- name: physical_therapist
num_bytes: 3392181.0
num_examples: 100
- name: insurance_agent
num_bytes: 3757883.0
num_examples: 100
- name: aerospace_engineer
num_bytes: 3796254.0
num_examples: 100
- name: psychologist
num_bytes: 3300681.0
num_examples: 100
- name: financial_advisor
num_bytes: 3319034.0
num_examples: 100
- name: printing_press_operator
num_bytes: 4371701.0
num_examples: 100
- name: architect
num_bytes: 3624303.0
num_examples: 100
- name: dental_hygienist
num_bytes: 3037225.0
num_examples: 100
- name: artist
num_bytes: 4038195.0
num_examples: 100
- name: office_worker
num_bytes: 3343369.0
num_examples: 100
- name: ceo
num_bytes: 3035277.0
num_examples: 100
- name: taxi_driver
num_bytes: 4532619.0
num_examples: 100
- name: librarian
num_bytes: 3934373.0
num_examples: 100
- name: author
num_bytes: 4016508.0
num_examples: 100
- name: plumber
num_bytes: 3932891.0
num_examples: 100
- name: construction_worker
num_bytes: 4155510.0
num_examples: 100
- name: clergy
num_bytes: 3781283.0
num_examples: 100
- name: electrician
num_bytes: 3783505.0
num_examples: 100
- name: jailer
num_bytes: 4507427.0
num_examples: 100
- name: credit_counselor
num_bytes: 3505147.0
num_examples: 100
- name: scientist
num_bytes: 4046533.0
num_examples: 100
- name: drywall_installer
num_bytes: 3478727.0
num_examples: 100
- name: school_bus_driver
num_bytes: 4890236.0
num_examples: 100
- name: dental_assistant
num_bytes: 2813410.0
num_examples: 100
- name: fitness_instructor
num_bytes: 3996469.0
num_examples: 100
- name: detective
num_bytes: 3422063.0
num_examples: 100
- name: hairdresser
num_bytes: 3241014.0
num_examples: 100
- name: welder
num_bytes: 4677109.0
num_examples: 100
- name: pharmacy_technician
num_bytes: 3700405.0
num_examples: 100
- name: compliance_officer
num_bytes: 3414977.0
num_examples: 100
- name: singer
num_bytes: 3802503.0
num_examples: 100
- name: tutor
num_bytes: 4062542.0
num_examples: 100
- name: language_pathologist
num_bytes: 3758118.0
num_examples: 100
- name: medical_records_specialist
num_bytes: 3271985.0
num_examples: 100
- name: sales_manager
num_bytes: 3205314.0
num_examples: 100
- name: industrial_engineer
num_bytes: 3971207.0
num_examples: 100
- name: manager
num_bytes: 3358224.0
num_examples: 100
- name: mechanic
num_bytes: 4067397.0
num_examples: 100
- name: postal_worker
num_bytes: 4003288.0
num_examples: 100
- name: computer_systems_analyst
num_bytes: 3539024.0
num_examples: 100
- name: salesperson
num_bytes: 3346595.0
num_examples: 100
- name: office_clerk
num_bytes: 3274748.0
num_examples: 100
- name: claims_appraiser
num_bytes: 5004316.0
num_examples: 100
- name: security_guard
num_bytes: 3794770.0
num_examples: 100
- name: interviewer
num_bytes: 3636369.0
num_examples: 100
- name: dispatcher
num_bytes: 3294510.0
num_examples: 100
- name: lawyer
num_bytes: 3196550.0
num_examples: 100
- name: marketing_manager
num_bytes: 3365180.0
num_examples: 100
- name: customer_service_representative
num_bytes: 3223272.0
num_examples: 100
- name: software_developer
num_bytes: 3333651.0
num_examples: 100
- name: mover
num_bytes: 4537574.0
num_examples: 100
- name: supervisor
num_bytes: 3841058.0
num_examples: 100
- name: paralegal
num_bytes: 3439628.0
num_examples: 100
- name: graphic_designer
num_bytes: 4234804.0
num_examples: 100
- name: dentist
num_bytes: 3106897.0
num_examples: 100
- name: roofer
num_bytes: 4839179.0
num_examples: 100
- name: public_relations_specialist
num_bytes: 3214669.0
num_examples: 100
- name: engineer
num_bytes: 3775481.0
num_examples: 100
- name: occupational_therapist
num_bytes: 3611377.0
num_examples: 100
- name: manicurist
num_bytes: 3099482.0
num_examples: 100
- name: cleaner
num_bytes: 4053227.0
num_examples: 100
- name: facilities_manager
num_bytes: 3761193.0
num_examples: 100
- name: repair_worker
num_bytes: 4110405.0
num_examples: 100
- name: cashier
num_bytes: 3631158.0
num_examples: 100
- name: baker
num_bytes: 3700422.0
num_examples: 100
- name: market_research_analyst
num_bytes: 3859395.0
num_examples: 100
- name: health_technician
num_bytes: 3182780.0
num_examples: 100
- name: veterinarian
num_bytes: 3550905.0
num_examples: 100
- name: underwriter
num_bytes: 3576463.0
num_examples: 100
- name: mechanical_engineer
num_bytes: 4339495.0
num_examples: 100
- name: janitor
num_bytes: 3784680.0
num_examples: 100
- name: pilot
num_bytes: 3669754.0
num_examples: 100
- name: therapist
num_bytes: 3484772.0
num_examples: 100
- name: director
num_bytes: 3533829.0
num_examples: 100
- name: wholesale_buyer
num_bytes: 4629384.0
num_examples: 100
- name: air_conditioning_installer
num_bytes: 4619514.0
num_examples: 100
- name: butcher
num_bytes: 4624676.0
num_examples: 100
- name: machinery_mechanic
num_bytes: 4698277.0
num_examples: 100
- name: event_planner
num_bytes: 4148009.0
num_examples: 100
- name: carpet_installer
num_bytes: 4831490.0
num_examples: 100
- name: musician
num_bytes: 3852103.0
num_examples: 100
- name: civil_engineer
num_bytes: 4235637.0
num_examples: 100
- name: farmer
num_bytes: 5696634.0
num_examples: 100
- name: financial_manager
num_bytes: 2996790.0
num_examples: 100
- name: childcare_worker
num_bytes: 4586909.0
num_examples: 100
- name: clerk
num_bytes: 3629737.0
num_examples: 100
- name: machinist
num_bytes: 3743309.0
num_examples: 100
- name: firefighter
num_bytes: 4238724.0
num_examples: 100
- name: photographer
num_bytes: 4709558.0
num_examples: 100
- name: file_clerk
num_bytes: 4227425.0
num_examples: 100
- name: bus_driver
num_bytes: 4414556.0
num_examples: 100
- name: fast_food_worker
num_bytes: 3558196.0
num_examples: 100
- name: bartender
num_bytes: 4220859.0
num_examples: 100
- name: computer_programmer
num_bytes: 3728830.0
num_examples: 100
- name: pharmacist
num_bytes: 3708399.0
num_examples: 100
- name: nursing_assistant
num_bytes: 3099367.0
num_examples: 100
- name: career_counselor
num_bytes: 3393143.0
num_examples: 100
- name: mental_health_counselor
num_bytes: 3215113.0
num_examples: 100
- name: network_administrator
num_bytes: 3856159.0
num_examples: 100
- name: teacher
num_bytes: 4085339.0
num_examples: 100
- name: dishwasher
num_bytes: 4261050.0
num_examples: 100
- name: teller
num_bytes: 3322494.0
num_examples: 100
- name: teaching_assistant
num_bytes: 3491372.0
num_examples: 100
- name: payroll_clerk
num_bytes: 3263271.0
num_examples: 100
- name: laboratory_technician
num_bytes: 3271155.0
num_examples: 100
- name: social_assistant
num_bytes: 3774971.0
num_examples: 100
- name: radiologic_technician
num_bytes: 3021348.0
num_examples: 100
- name: social_worker
num_bytes: 4232132.0
num_examples: 100
- name: nurse
num_bytes: 3272832.0
num_examples: 100
- name: receptionist
num_bytes: 3134671.0
num_examples: 100
- name: carpenter
num_bytes: 4402559.0
num_examples: 100
- name: correctional_officer
num_bytes: 3789430.0
num_examples: 100
- name: community_manager
num_bytes: 3756220.0
num_examples: 100
- name: massage_therapist
num_bytes: 2980706.0
num_examples: 100
- name: head_cook
num_bytes: 3919248.0
num_examples: 100
- name: plane_mechanic
num_bytes: 3715118.0
num_examples: 100
download_size: 581741855
dataset_size: 557958672.0
---
# Dataset Card for "prof_images_blip__wavymulder-Analog-Diffusion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TohidA/MONA | ---
dataset_name: MONA
dataset_type: tabular
task_categories: [tabular-classification, tabular-regression]
license: openrail
dataset_info:
features:
- name: Arrangement Number
dtype: int64
- name: Country Name
dtype: string
- name: Country Code
dtype: int64
- name: Arrangement Type
dtype: string
- name: Approval date
dtype: string
- name: Approval Year
dtype: int64
- name: Initial End Date
dtype: string
- name: Initial End Year
dtype: int64
- name: Revised End Date
dtype: string
- name: Duration Of Annual Arrangement From
dtype: string
- name: Duration Of Annual Arrangement To
dtype: string
- name: Board Action Date
dtype: string
- name: Program Type
dtype: string
- name: Review Type
dtype: string
- name: Review Status
dtype: string
- name: Key Code
dtype: string
- name: Economic Code
dtype: float64
- name: Economic Descriptor
dtype: string
- name: Description
dtype: string
- name: Description Code
dtype: int64
- name: Test Date
dtype: string
- name: PC Status
dtype: string
- name: Comments
dtype: string
- name: Sort
dtype: int64
- name: EsOrder
dtype: int64
- name: NewTestDate
dtype: string
- name: Added At
dtype: string
- name: Assessed At
dtype: string
- name: Unique ID
dtype: string
- name: Parent ID
dtype: string
splits:
- name: train
num_bytes: 25540700
num_examples: 48988
download_size: 0
dataset_size: 25540700
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MONA Arrangements Dataset
A publicly available dataset published here: https://www.imf.org/external/np/pdr/mona/QueryReportLabelsAndDescriptions.aspx |
tyzhu/squad_qa_baseline_v5_full_last_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2496440.0
num_examples: 2385
- name: validation
num_bytes: 335684
num_examples: 300
download_size: 0
dataset_size: 2832124.0
---
# Dataset Card for "squad_qa_baseline_v5_full_last_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vencortex/DeOSAgentDocuments | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: company_id
dtype: string
- name: context_id
dtype: string
- name: source
dtype: string
- name: date
dtype: string
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 33884007
num_examples: 10000
download_size: 29585235
dataset_size: 33884007
---
# Dataset Card for "DeOSAgentDocuments"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
2030NLP/SpaCE2023 | ---
task_categories:
- text-classification
- text-generation
- feature-extraction
language:
- zh
size_categories:
- 1M<n<10M
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
A dataset for Chinese Spatial Semantics Understanding.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [Department of Chinese Language and Literature, Peking University]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [Chinese]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [https://github.com/2030NLP/SpaCE2023]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maidalun1020/CrosslingualRetrievalBooksEn2Zh-qrels | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: qid
dtype: string
- name: pid
dtype: string
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 766510
num_examples: 31411
download_size: 410843
dataset_size: 766510
---
|
ThWu/truthful_benchmark | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: response_c
dtype: string
- name: ranked_responses
sequence: string
splits:
- name: train
num_bytes: 315878
num_examples: 817
download_size: 179861
dataset_size: 315878
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlanYky/climate-with-instruction-with-label | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 4023178
num_examples: 800
download_size: 1812429
dataset_size: 4023178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wyuelin/testDataset | ---
license: apache-2.0
---
|
liuqingwen/github-issues | ---
license: mit
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 9688804
num_examples: 1000
download_size: 2589093
dataset_size: 9688804
---
|
vietgpt/qwen-nmt | ---
dataset_info:
features:
- name: laser_score
dtype: float64
- name: lang1
dtype: string
- name: text1
dtype: string
- name: lang2
dtype: string
- name: text2
dtype: string
- name: blaser_sim
dtype: float64
- name: source
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 79362497
num_examples: 108000
download_size: 35487099
dataset_size: 79362497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kmpartner/dummy-dataset-github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 20688424
num_examples: 6737
download_size: 5076727
dataset_size: 20688424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
p1atdev/badmitsua | ---
license: cc0-1.0
---
Negative TI (Textual Inversion) embedding for [Mitsua](https://huggingface.co/Mitsua/mitsua-diffusion-one)
## Test1
TI
- [badmitsua-test1-e10.pt](https://huggingface.co/datasets/p1atdev/badmitsua/blob/main/embeddings/badmitsua-test1-e10.pt)
Dataset
Uses 150 images generated with mitsua-diffusion-one-base
- [test1.zip](https://huggingface.co/datasets/p1atdev/badmitsua/blob/main/test1.zip)
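A minimal usage sketch (not part of the original card): registering the embedding in a 🤗 diffusers pipeline and referencing it from the negative prompt. The helper name and token are illustrative assumptions, not an official API of this repository.

```python
# Illustrative sketch only: `pipe` is assumed to be a diffusers
# StableDiffusionPipeline-compatible object, and "badmitsua" is an
# arbitrary token name chosen for the embedding.

def apply_negative_embedding(pipe, embedding_path, token="badmitsua"):
    """Register a textual-inversion embedding on the pipeline and
    return the token to place in the negative prompt."""
    pipe.load_textual_inversion(embedding_path, token=token)
    return token

# e.g. negative_prompt = apply_negative_embedding(pipe, "badmitsua-test1-e10.pt")
```

Once registered, including the returned token in `negative_prompt` applies the embedding at inference time.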
|
DrBenchmark/DiaMED | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4 | ---
pretty_name: Evaluation run of hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4](https://huggingface.co/hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T18:33:35.231026](https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4/blob/main/results_2024-03-31T18-33-35.231026.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48571884984069763,\n\
\ \"acc_stderr\": 0.03429845395138502,\n \"acc_norm\": 0.4904132318843078,\n\
\ \"acc_norm_stderr\": 0.035053692176394785,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.45774742631279514,\n\
\ \"mc2_stderr\": 0.01544325793353912\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.014558106543924067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5860386377215694,\n\
\ \"acc_stderr\": 0.00491535110731875,\n \"acc_norm\": 0.7810197171878112,\n\
\ \"acc_norm_stderr\": 0.0041271002813796\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556545,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556545\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n\
\ \"acc_stderr\": 0.02840609505765332,\n \"acc_norm\": 0.5258064516129032,\n\
\ \"acc_norm_stderr\": 0.02840609505765332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561953,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561953\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.034812853382329624,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.034812853382329624\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4256410256410256,\n \"acc_stderr\": 0.02506909438729654,\n \
\ \"acc_norm\": 0.4256410256410256,\n \"acc_norm_stderr\": 0.02506909438729654\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03214536859788639,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03214536859788639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\"\
: 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488419,\n\
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610784,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610784\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.03314190222110658,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.03314190222110658\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.669220945083014,\n\
\ \"acc_stderr\": 0.01682481846256375,\n \"acc_norm\": 0.669220945083014,\n\
\ \"acc_norm_stderr\": 0.01682481846256375\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20670391061452514,\n\
\ \"acc_stderr\": 0.013543260867834462,\n \"acc_norm\": 0.20670391061452514,\n\
\ \"acc_norm_stderr\": 0.013543260867834462\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n\
\ \"acc_stderr\": 0.02817391776176289,\n \"acc_norm\": 0.5627009646302251,\n\
\ \"acc_norm_stderr\": 0.02817391776176289\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607697,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607697\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3455019556714472,\n\
\ \"acc_stderr\": 0.012145303004087206,\n \"acc_norm\": 0.3455019556714472,\n\
\ \"acc_norm_stderr\": 0.012145303004087206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186453,\n \
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186453\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.45774742631279514,\n\
\ \"mc2_stderr\": 0.01544325793353912\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998292\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19257012888551933,\n \
\ \"acc_stderr\": 0.010861483868509941\n }\n}\n```"
repo_url: https://huggingface.co/hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|arc:challenge|25_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|arc:challenge|25_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|gsm8k|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|gsm8k|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hellaswag|10_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hellaswag|10_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T18-32-16.094801.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T18-33-35.231026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T18-33-35.231026.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- '**/details_harness|winogrande|5_2024-03-31T18-32-16.094801.parquet'
- split: 2024_03_31T18_33_35.231026
path:
- '**/details_harness|winogrande|5_2024-03-31T18-33-35.231026.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T18-33-35.231026.parquet'
- config_name: results
data_files:
- split: 2024_03_31T18_32_16.094801
path:
- results_2024-03-31T18-32-16.094801.parquet
- split: 2024_03_31T18_33_35.231026
path:
- results_2024-03-31T18-33-35.231026.parquet
- split: latest
path:
- results_2024-03-31T18-33-35.231026.parquet
---
# Dataset Card for Evaluation run of hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4](https://huggingface.co/hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4",
"harness_winogrande_5",
split="train")
```
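Each timestamped split name is simply the run timestamp with dashes and colons replaced by underscores. A small helper (hypothetical, shown only to illustrate the naming convention) derives the split name from a timestamp:

```python
def split_name(timestamp: str) -> str:
    """Convert a run timestamp (e.g. '2024-03-31T18:33:35.231026')
    into the corresponding split name used in this dataset."""
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(split_name("2024-03-31T18:33:35.231026"))
# -> 2024_03_31T18_33_35.231026
```

Passing the resulting name as `split=` to `load_dataset` retrieves that specific run instead of the latest one.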
## Latest results
These are the [latest results from run 2024-03-31T18:33:35.231026](https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4/blob/main/results_2024-03-31T18-33-35.231026.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.48571884984069763,
"acc_stderr": 0.03429845395138502,
"acc_norm": 0.4904132318843078,
"acc_norm_stderr": 0.035053692176394785,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.45774742631279514,
"mc2_stderr": 0.01544325793353912
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5426621160409556,
"acc_norm_stderr": 0.014558106543924067
},
"harness|hellaswag|10": {
"acc": 0.5860386377215694,
"acc_stderr": 0.00491535110731875,
"acc_norm": 0.7810197171878112,
"acc_norm_stderr": 0.0041271002813796
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556545,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556545
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924314,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924314
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561953,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561953
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.034812853382329624,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.034812853382329624
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4256410256410256,
"acc_stderr": 0.02506909438729654,
"acc_norm": 0.4256410256410256,
"acc_norm_stderr": 0.02506909438729654
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03214536859788639,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03214536859788639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610784,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610784
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110658,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110658
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.669220945083014,
"acc_stderr": 0.01682481846256375,
"acc_norm": 0.669220945083014,
"acc_norm_stderr": 0.01682481846256375
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20670391061452514,
"acc_stderr": 0.013543260867834462,
"acc_norm": 0.20670391061452514,
"acc_norm_stderr": 0.013543260867834462
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.02817391776176289,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.02817391776176289
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607697,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3455019556714472,
"acc_stderr": 0.012145303004087206,
"acc_norm": 0.3455019556714472,
"acc_norm_stderr": 0.012145303004087206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.020217030653186453,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.020217030653186453
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.45774742631279514,
"mc2_stderr": 0.01544325793353912
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998292
},
"harness|gsm8k|5": {
"acc": 0.19257012888551933,
"acc_stderr": 0.010861483868509941
}
}
```
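The per-task numbers above are stored as plain JSON, so quick aggregates are easy to compute once the file is parsed. The helper below is a minimal sketch (the function name and the small stand-in `results` dict are illustrative, not part of the card): it macro-averages the `acc` field over all `hendrycksTest` splits.

```python
import json

def macro_avg_acc(results: dict, prefix: str = "harness|hendrycksTest-") -> float:
    """Average the 'acc' field over all tasks whose key starts with `prefix`."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    if not accs:
        raise ValueError(f"no tasks matching {prefix!r}")
    return sum(accs) / len(accs)

# Tiny stand-in for the parsed results JSON shown above:
results = json.loads("""{
  "harness|hendrycksTest-virology|5": {"acc": 0.41566265060240964},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.7192982456140351},
  "harness|gsm8k|5": {"acc": 0.19257012888551933}
}""")
print(round(macro_avg_acc(results), 4))  # 0.5675 (gsm8k is excluded by the prefix)
```

Applied to the full results block above, the same call yields the MMLU macro-average for this run.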
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arbitropy/bquac_new_answers | ---
dataset_info:
features:
- name: questions
sequence: string
- name: source
dtype: string
- name: en_questions
sequence: string
- name: en_answer_spans
sequence: string
- name: questions_scores
sequence: float64
- name: id
dtype: int64
- name: answers
sequence: string
- name: story
dtype: string
- name: answers_scores
sequence: float64
splits:
- name: train
num_bytes: 57271950
num_examples: 11567
- name: validation
num_bytes: 5298726
num_examples: 1000
download_size: 32871688
dataset_size: 62570676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/qingque_starrail | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of qingque/青雀/青雀/청작 (Honkai: Star Rail)
This is the dataset of qingque/青雀/青雀/청작 (Honkai: Star Rail), containing 163 images and their tags.
The core tags of this character are `long_hair, green_eyes, hair_ornament, bangs, brown_hair, hairclip, hair_between_eyes, breasts, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 163 | 299.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qingque_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 163 | 136.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qingque_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 412 | 308.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qingque_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 163 | 248.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qingque_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 412 | 486.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qingque_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/qingque_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
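Once loaded, the per-image tags can be aggregated, for example to rank the most common tags in the crawl. The sketch below is illustrative: the `tag_frequencies` helper is not part of waifuc, and it assumes `item.meta['tags']` is a dict (tag name to score) or a plain list of tag names, which may differ from the actual metadata layout.

```python
from collections import Counter

def tag_frequencies(tag_maps):
    """Count how often each tag occurs across per-image tag collections."""
    counter = Counter()
    for tags in tag_maps:
        # list(...) handles both dicts (iterates keys) and plain lists of names
        counter.update(list(tags))
    return counter

# With the waifuc source loaded above, this would be:
#   freqs = tag_frequencies(item.meta['tags'] for item in source)
# Illustrative sample data:
sample = [
    {'1girl': 0.99, 'solo': 0.95, 'smile': 0.80},
    {'1girl': 0.98, 'mahjong_tile': 0.70},
]
print(tag_frequencies(sample).most_common(2))  # [('1girl', 2), ('solo', 1)]
```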
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, bare_shoulders, long_sleeves, simple_background, white_background, open_mouth, virtual_youtuber, blush, dress, mahjong_tile, skirt, clothing_cutout, holding |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, toes, barefoot, closed_mouth, soles, bare_legs, bare_shoulders, blush, foot_focus, smile, :3, sitting, cleavage_cutout, dress, foreshortening, large_breasts |
| 2 | 5 |  |  |  |  |  | 1girl, arms_behind_back, bare_shoulders, bondage, restrained, solo, dress, gagged, looking_at_viewer, red_rope, bound_arms, bound_legs, cleavage, cloth_gag, clothing_cutout, indoors, shibari_over_clothes |
| 3 | 11 |  |  |  |  |  | 1girl, hetero, 1boy, nipples, sex, blush, open_mouth, penis, solo_focus, cum_in_pussy, virtual_youtuber, braid, sweat, vaginal, mosaic_censoring, small_breasts, spread_legs, completely_nude, heart, pov |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | bare_shoulders | long_sleeves | simple_background | white_background | open_mouth | virtual_youtuber | blush | dress | mahjong_tile | skirt | clothing_cutout | holding | toes | barefoot | closed_mouth | soles | bare_legs | foot_focus | :3 | sitting | cleavage_cutout | foreshortening | large_breasts | arms_behind_back | bondage | restrained | gagged | red_rope | bound_arms | bound_legs | cleavage | cloth_gag | indoors | shibari_over_clothes | hetero | 1boy | nipples | sex | penis | solo_focus | cum_in_pussy | braid | sweat | vaginal | mosaic_censoring | small_breasts | spread_legs | completely_nude | heart | pov |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-----------------|:---------------|:--------------------|:-------------------|:-------------|:-------------------|:--------|:--------|:---------------|:--------|:------------------|:----------|:-------|:-----------|:---------------|:--------|:------------|:-------------|:-----|:----------|:------------------|:-----------------|:----------------|:-------------------|:----------|:-------------|:---------|:-----------|:-------------|:-------------|:-----------|:------------|:----------|:-----------------------|:---------|:-------|:----------|:------|:--------|:-------------|:---------------|:--------|:--------|:----------|:-------------------|:----------------|:--------------|:------------------|:--------|:------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
maghwa/OpenHermes-2-AR-10K-46-900k-910k | ---
dataset_info:
features:
- name: id
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: topic
dtype: 'null'
- name: hash
dtype: 'null'
- name: model
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
dtype: string
- name: model_name
dtype: 'null'
- name: source
dtype: string
- name: skip_prompt_formatting
dtype: 'null'
- name: language
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: category
dtype: 'null'
- name: views
dtype: float64
splits:
- name: train
num_bytes: 28769438
num_examples: 10001
download_size: 11093895
dataset_size: 28769438
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_rhaymison__Mistral-portuguese-luana-7b-chat | ---
pretty_name: Evaluation run of rhaymison/Mistral-portuguese-luana-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rhaymison/Mistral-portuguese-luana-7b-chat](https://huggingface.co/rhaymison/Mistral-portuguese-luana-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rhaymison__Mistral-portuguese-luana-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T13:48:54.646631](https://huggingface.co/datasets/open-llm-leaderboard/details_rhaymison__Mistral-portuguese-luana-7b-chat/blob/main/results_2024-04-15T13-48-54.646631.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6061458325666466,\n\
\ \"acc_stderr\": 0.03323939711070773,\n \"acc_norm\": 0.6116093943146148,\n\
\ \"acc_norm_stderr\": 0.03391645488627437,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5459664955224175,\n\
\ \"mc2_stderr\": 0.015288575747089443\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5409556313993175,\n \"acc_stderr\": 0.01456229107360123,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.01435639941800912\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.611431985660227,\n\
\ \"acc_stderr\": 0.004864286176731837,\n \"acc_norm\": 0.813981278629755,\n\
\ \"acc_norm_stderr\": 0.003883265210791707\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849725,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849725\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033581,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033581\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6935483870967742,\n\
\ \"acc_stderr\": 0.02622648565255388,\n \"acc_norm\": 0.6935483870967742,\n\
\ \"acc_norm_stderr\": 0.02622648565255388\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072388,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072388\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039504,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039504\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101074,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.016361354769822468,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.016361354769822468\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.012596744108998557,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.012596744108998557\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.01976621199107307,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.01976621199107307\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5459664955224175,\n\
\ \"mc2_stderr\": 0.015288575747089443\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.01196129890580315\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3821076573161486,\n \
\ \"acc_stderr\": 0.013384173935648494\n }\n}\n```"
repo_url: https://huggingface.co/rhaymison/Mistral-portuguese-luana-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|arc:challenge|25_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|gsm8k|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hellaswag|10_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-48-54.646631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T13-48-54.646631.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- '**/details_harness|winogrande|5_2024-04-15T13-48-54.646631.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T13-48-54.646631.parquet'
- config_name: results
data_files:
- split: 2024_04_15T13_48_54.646631
path:
- results_2024-04-15T13-48-54.646631.parquet
- split: latest
path:
- results_2024-04-15T13-48-54.646631.parquet
---
# Dataset Card for Evaluation run of rhaymison/Mistral-portuguese-luana-7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rhaymison/Mistral-portuguese-luana-7b-chat](https://huggingface.co/rhaymison/Mistral-portuguese-luana-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rhaymison__Mistral-portuguese-luana-7b-chat",
	"harness_hendrycksTest_world_religions_5",
	split="latest")
```
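As the YAML above suggests, each config name appears to be derived mechanically from the harness task identifier by replacing the separator characters (`|`, `-`, `:`, `.`) with underscores. A minimal sketch of that mapping, useful for picking the right config name programmatically (the helper name is hypothetical, not part of the `datasets` API):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id such as 'harness|hendrycksTest-anatomy|5'
    to the dataset config name 'harness_hendrycksTest_anatomy_5'."""
    # Replace every separator character used in task ids with an underscore.
    for sep in ("|", "-", ":", "."):
        task = task.replace(sep, "_")
    return task

print(task_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```

The resulting string can then be passed as the second argument to `load_dataset`, as in the example above.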
## Latest results
These are the [latest results from run 2024-04-15T13:48:54.646631](https://huggingface.co/datasets/open-llm-leaderboard/details_rhaymison__Mistral-portuguese-luana-7b-chat/blob/main/results_2024-04-15T13-48-54.646631.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of its own config):
```python
{
"all": {
"acc": 0.6061458325666466,
"acc_stderr": 0.03323939711070773,
"acc_norm": 0.6116093943146148,
"acc_norm_stderr": 0.03391645488627437,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5459664955224175,
"mc2_stderr": 0.015288575747089443
},
"harness|arc:challenge|25": {
"acc": 0.5409556313993175,
"acc_stderr": 0.01456229107360123,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.01435639941800912
},
"harness|hellaswag|10": {
"acc": 0.611431985660227,
"acc_stderr": 0.004864286176731837,
"acc_norm": 0.813981278629755,
"acc_norm_stderr": 0.003883265210791707
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849725,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849725
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033581,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033581
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.02622648565255388,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.02622648565255388
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072388,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072388
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039504,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039504
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101074,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.016361354769822468,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.016361354769822468
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998557,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.01976621199107307,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.01976621199107307
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5459664955224175,
"mc2_stderr": 0.015288575747089443
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.01196129890580315
},
"harness|gsm8k|5": {
"acc": 0.3821076573161486,
"acc_stderr": 0.013384173935648494
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/lmind_hotpot_train300_eval100_v1_reciteonly_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 51441
num_examples: 300
- name: train_recite_qa
num_bytes: 312070
num_examples: 300
- name: eval_qa
num_bytes: 16148
num_examples: 100
- name: eval_recite_qa
num_bytes: 104950
num_examples: 100
- name: all_docs
num_bytes: 361191
num_examples: 797
- name: all_docs_eval
num_bytes: 361140
num_examples: 797
- name: train
num_bytes: 312070
num_examples: 300
- name: validation
num_bytes: 104950
num_examples: 100
download_size: 817849
dataset_size: 1623960
---
# Dataset Card for "lmind_hotpot_train300_eval100_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Azam/Pippi | ---
license: apache-2.0
---
|
loubnabnl/pre-processed-issues | ---
dataset_info:
features:
- name: repo
dtype: string
- name: org
dtype: string
- name: issue_id
dtype: int64
- name: issue_number
dtype: int64
- name: pull_request
struct:
- name: number
dtype: int64
- name: repo
dtype: string
- name: user_login
dtype: string
- name: events
list:
- name: action
dtype: string
- name: author
dtype: string
- name: comment_id
dtype: float64
- name: datetime
dtype: int64
- name: masked_author
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: type
dtype: string
- name: text_size
dtype: int64
- name: bot_issue
dtype: bool
- name: modified_by_bot
dtype: bool
- name: user_count
dtype: int64
- name: event_count
dtype: int64
- name: modified_usernames
dtype: bool
splits:
- name: train
num_bytes: 15607937
num_examples: 6759
download_size: 7397345
dataset_size: 15607937
---
# Dataset Card for "pre-processed-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DTU54DL/common-accent-augmented-proc | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
paperswithcode_id: acronym-identification
pretty_name: Acronym Identification Dataset
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- token-classification-other-acronym-identification
train-eval-index:
- col_mapping:
labels: tags
tokens: tokens
config: default
splits:
eval_split: test
task: token-classification
task_id: entity_extraction
dataset_info:
features:
- name: sentence
dtype: string
- name: accent
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 433226048
num_examples: 451
- name: train
num_bytes: 9606026408
num_examples: 10000
download_size: 2307292790
dataset_size: 10039252456
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
thobauma/harmless-poisoned-0.005-SuperGodModeActivated-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rntc/few_shot_ncbi_disease_pubmed | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: gold
dtype: string
- name: doc_id
dtype: int64
- name: sent_offset
sequence: int64
- name: sent_len
dtype: int64
- name: context
dtype: string
splits:
- name: train
num_bytes: 4272870
num_examples: 978
download_size: 659920
dataset_size: 4272870
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/metatree_RandomRBF_10_1E_4 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 70077600
num_examples: 700776
- name: validation
num_bytes: 29922400
num_examples: 299224
download_size: 103910649
dataset_size: 100000000
---
# Dataset Card for "metatree_RandomRBF_10_1E_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
recursix/geo-bench-1.0 | ---
license: cc-by-sa-4.0
pretty_name: GEO-Bench 1.0
size_categories:
- 10B<n<100B
--- |
tyzhu/random_letter_find_passage_train30_eval40_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10027
num_examples: 100
- name: validation
num_bytes: 5089
num_examples: 40
download_size: 11357
dataset_size: 15116
---
# Dataset Card for "random_letter_find_passage_train30_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Khalida1w/funny_quotes | ---
license: apache-2.0
---
|
AiresPucrs/movielens-movies | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: movieId
dtype: int64
- name: title
dtype: string
- name: genres
dtype: string
splits:
- name: train
num_bytes: 563045
num_examples: 9742
download_size: 300293
dataset_size: 563045
language:
- en
pretty_name: Movielens-movies
size_categories:
- 1K<n<10K
license: other
---
# Movielens-movies
This dataset contains a set of movies from the MovieLens website, a movie recommendation service.
## Overview
MovieLens data sets were collected by the GroupLens Research Project at the University of Minnesota.
GroupLens Research has collected and made available rating data sets from the [MovieLens website](https://movielens.org).
The MovieLens 100K dataset contains 100,000 ratings (1-5) from 943 users on 1,682 movies. It was released in 1998.
## Dataset Details
The dataset from Kaggle is named [MovieLens100](https://www.kaggle.com/datasets/abhikjha/movielens-100k).
It contains separate CSV files for movies, ratings, links, and tags. Only the file "movies.csv" is used in the **movielens-movies** dataset.
- Dataset Name: movielens-movies
- Language: English
- Total Size: 9,742 movie entries
**Citation:**
```bibtex
@article{10.1145/2827872,
author = {Harper, F. Maxwell and Konstan, Joseph A.},
title = {The MovieLens Datasets: History and Context},
year = {2015},
issue_date = {January 2016},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {5},
number = {4},
issn = {2160-6455},
url = {https://doi.org/10.1145/2827872},
doi = {10.1145/2827872},
journal = {ACM Trans. Interact. Intell. Syst.},
month = dec,
articleno = {19},
numpages = {19},
keywords = {Datasets, recommendations, ratings, MovieLens}
}
```
## Contents
The dataset consists of a data frame with the following columns:
- **movieId:** a unique identifier of the rated movie.
- **title:** the title of the rated movie with the release year in parentheses.
- **genres:** a sequence of genres to which the rated movie belongs.
```python
{
    "movieId": 2,
    "title": "Jumanji (1995)",
    "genres": "Adventure|Children|Fantasy"
}
```
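Since the `genres` field packs multiple genres into one pipe-delimited string, a minimal sketch of splitting it into a list (the record below mirrors the example above; field names follow the dataset schema):

```python
# A single record from the dataset; "genres" uses "|" as a separator.
record = {
    "movieId": 2,
    "title": "Jumanji (1995)",
    "genres": "Adventure|Children|Fantasy",
}

# Split the pipe-delimited string into individual genre labels.
genre_list = record["genres"].split("|")
print(genre_list)  # ['Adventure', 'Children', 'Fantasy']
```

The same split can be applied column-wise after loading the dataset, e.g. with `dataset.map`.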
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/movielens-movies", split='train')
```
## License
This dataset is distributed under the original MovieLens usage license: [Other](https://files.grouplens.org/datasets/movielens/ml-100k-README.txt).
jamesLeeeeeee/datasets-github-issues | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- code
size_categories:
- 10M<n<100M
--- |
result-kand2-sdxl-wuerst-karlo/463b7b19 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 161
num_examples: 10
download_size: 1299
dataset_size: 161
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "463b7b19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
taejunkim/alignments | ---
dataset_info:
features:
- name: mix_id
dtype: string
- name: track_id
dtype: string
- name: case_name
dtype: string
- name: feature
dtype: string
- name: metric
dtype: string
- name: key_change
dtype: int64
- name: match_rate
dtype: float64
- name: match_rate_raw
dtype: float64
- name: matched_beats
dtype: int64
- name: matched_beats_raw
dtype: int64
- name: matched_time_mix
dtype: float64
- name: matched_time_track
dtype: float64
- name: mix_cue_in_beat
dtype: float64
- name: mix_cue_out_beat
dtype: float64
- name: track_cue_in_beat
dtype: float64
- name: track_cue_out_beat
dtype: float64
- name: mix_cue_in_time
dtype: float64
- name: mix_cue_out_time
dtype: float64
- name: track_cue_in_time
dtype: float64
- name: track_cue_out_time
dtype: float64
- name: cost
dtype: float64
- name: __index_level_0__
dtype: int64
- name: wp
sequence:
sequence: int64
splits:
- name: train
num_bytes: 22961341
num_examples: 6600
download_size: 3089520
dataset_size: 22961341
---
# Dataset Card for "alignments"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_tiiuae__falcon-40b | ---
pretty_name: Evaluation run of tiiuae/falcon-40b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tiiuae/falcon-40b](https://huggingface.co/tiiuae/falcon-40b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 124 configurations, each one corresponding to one of\
\ the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" store all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tiiuae__falcon-40b\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T19:45:58.201621](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-40b/blob/main/results_2023-12-03T19-45-58.201621.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.21455648218347234,\n\
\ \"acc_stderr\": 0.011307604104052885\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.21455648218347234,\n \"acc_stderr\": 0.011307604104052885\n\
\ }\n}\n```"
repo_url: https://huggingface.co/tiiuae/falcon-40b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|arc:challenge|25_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_08T21_43_04.856041
path:
- '**/details_harness|drop|3_2023-09-08T21-43-04.856041.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-08T21-43-04.856041.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_08T21_43_04.856041
path:
- '**/details_harness|gsm8k|5_2023-09-08T21-43-04.856041.parquet'
- split: 2023_12_03T19_45_58.201621
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-45-58.201621.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-45-58.201621.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hellaswag|10_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_0
data_files:
- split: 2023_08_21T11_07_51.058817
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T11:07:51.058817.parquet'
- split: 2023_08_21T11_30_10.858708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T11:30:10.858708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T11:30:10.858708.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-21T22:49:59.134750.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_0
data_files:
- split: 2023_08_21T11_07_51.058817
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T11:07:51.058817.parquet'
- split: 2023_08_21T11_30_10.858708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T11:30:10.858708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|0_2023-08-21T11:30:10.858708.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_21T22_49_59.134750
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-21T22:49:59.134750.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-21T22:49:59.134750.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_08T21_43_04.856041
path:
- '**/details_harness|winogrande|5_2023-09-08T21-43-04.856041.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-08T21-43-04.856041.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:17:39.708485.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:17:39.708485.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_17_39.708485
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:17:39.708485.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:17:39.708485.parquet'
- config_name: results
data_files:
- split: 2023_08_21T11_07_51.058817
path:
- results_2023-08-21T11:07:51.058817.parquet
- split: 2023_08_21T11_30_10.858708
path:
- results_2023-08-21T11:30:10.858708.parquet
- split: 2023_08_21T22_49_59.134750
path:
- results_2023-08-21T22:49:59.134750.parquet
- split: 2023_08_28T20_17_39.708485
path:
- results_2023-08-28T20:17:39.708485.parquet
- split: 2023_09_08T21_43_04.856041
path:
- results_2023-09-08T21-43-04.856041.parquet
- split: 2023_12_03T19_45_58.201621
path:
- results_2023-12-03T19-45-58.201621.parquet
- split: latest
path:
- results_2023-12-03T19-45-58.201621.parquet
---
# Dataset Card for Evaluation run of tiiuae/falcon-40b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-40b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [tiiuae/falcon-40b](https://huggingface.co/tiiuae/falcon-40b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 124 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-40b",
"harness_gsm8k_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-03T19:45:58.201621](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-40b/blob/main/results_2023-12-03T19-45-58.201621.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.21455648218347234,
"acc_stderr": 0.011307604104052885
},
"harness|gsm8k|5": {
"acc": 0.21455648218347234,
"acc_stderr": 0.011307604104052885
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
akjindal53244/200k_replaced_SNI_w_random_SNI | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: train_dataset.json
- split: test
path: eval_dataset.json
---
|
HowardTan/garden-blip-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 158170155.0
num_examples: 811
download_size: 157791101
dataset_size: 158170155.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
result-kand2-sdxl-wuerst-karlo/b745e329 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 235
num_examples: 10
download_size: 1403
dataset_size: 235
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "b745e329"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1245832e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 36
num_examples: 2
download_size: 1264
dataset_size: 36
---
# Dataset Card for "1245832e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orderofmagnitude/alpaca_dataset.json | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_rte_double_past | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 171024
num_examples: 359
- name: train
num_bytes: 134062
num_examples: 282
download_size: 204183
dataset_size: 305086
---
# Dataset Card for "MULTI_VALUE_rte_double_past"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_49 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1230653600
num_examples: 239800
download_size: 1257701129
dataset_size: 1230653600
---
# Dataset Card for "chunk_49"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
devishi-raizada/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 247933
num_examples: 462
- name: vaidation
num_bytes: 42653
num_examples: 58
- name: test
num_bytes: 54849
num_examples: 58
download_size: 219341
dataset_size: 345435
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: vaidation
path: data/vaidation-*
- split: test
path: data/test-*
---
|
lordsymbol/zeu | ---
license: openrail
---
|
puromusculo/gustavolima1 | ---
license: openrail
---
|
umarzein/databricks-dolly-15k-id | ---
license: cc-by-sa-3.0
---
status: incomplete (needs further adjustments)
This dataset was created by translating "databricks-dolly-15k.jsonl" from English into Indonesian using facebook/m2m100_418M and applying further adjustments.
The further adjustments include:
1. fixing words that are still in English
2. adjusting responses that start with stopwords, e.g. "oleh", "di", "dengan"
3. fixing repetitions that occur in multi-line text ("Everything Everything Everything Everything ...")
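A simple way to catch the repetition errors described in point 3 is to scan for the same word appearing many times in a row. This is a minimal sketch of that idea, not the script actually used for this dataset; the repeat threshold and whitespace tokenization are assumptions:

```python
def find_repetition(text: str, min_repeats: int = 4) -> bool:
    """Return True if any word appears `min_repeats` or more times in a row."""
    words = text.split()
    run = 1
    for prev, cur in zip(words, words[1:]):
        run = run + 1 if cur == prev else 1
        if run >= min_repeats:
            return True
    return False


def collapse_repetition(text: str) -> str:
    """Collapse consecutive duplicate words down to a single occurrence."""
    out = []
    for word in text.split():
        if not out or out[-1] != word:
            out.append(word)
    return " ".join(out)
```

Rows flagged by such a check could then be listed (as below) or rewritten by collapsing the duplicates.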
This dataset can be used for any purpose, whether academic or commercial, under the terms of the Creative Commons Attribution-ShareAlike 3.0 Unported License.
## Caveats
The current databricks-dolly-15k dataset may not completely match this one.
Row indices that contain repetition errors (207):
96
112
262
273
369
376
389
410
415
432
581
586
597
685
870
886
936
957
964
979
985
1025
1120
1216
1223
1246
1251
1262
1316
1495
1552
1614
1684
1697
1733
1756
1808
1878
1893
2060
2118
2152
2168
2464
2474
2615
2663
2712
2829
2971
3046
3068
3123
3154
3178
3289
3336
3340
3401
3545
3574
3593
3599
3629
3745
3883
3889
3896
3967
3978
3993
4181
4186
4220
4232
4338
4358
4460
4497
4516
4614
4645
4689
4757
4809
4826
4865
5107
5232
5266
5296
5418
5493
5754
5791
5797
5819
5852
5968
6354
6409
6481
6499
6553
6555
6580
6659
6866
6911
6944
7020
7074
7116
7169
7390
7599
7777
7787
7846
7870
7894
8036
8051
8090
8144
8188
8294
8349
8406
8471
8527
8546
8552
8777
8836
8852
9026
9133
9136
9186
9287
9329
9335
9365
9475
9508
9509
9607
9630
9701
9731
9790
9822
9855
10214
10251
10308
10475
10536
10546
10683
10776
10803
10972
11069
11085
11199
11334
11350
11407
11421
11540
11570
11658
11758
11774
12004
12064
12374
12380
12519
12591
12623
12764
12844
12849
12923
12926
12953
13099
13225
13231
13352
13428
13602
13634
13810
13833
13851
13893
14021
14097
14145
14234
14240
14826
14884
|
EnergyStarAI/sentence_similarity | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
splits:
- name: train
num_bytes: 189609
num_examples: 1000
download_size: 141735
dataset_size: 189609
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/ai4lam-demo2 | ---
dataset_info:
features:
- name: metadata_text
dtype: string
- name: label
dtype:
class_label:
names:
0: Low_Quality
1: High_Quality
- name: source
dtype: string
splits:
- name: train
num_bytes: 29309108
num_examples: 100821
download_size: 16023375
dataset_size: 29309108
---
# Dataset Card for "ai4lam-demo2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/oasst | ---
dataset_info:
features:
- name: message_tree_id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 15836903
num_examples: 9823
download_size: 9334076
dataset_size: 15836903
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
larryvrh/PIPPA-TavernFormat | ---
dataset_info:
features:
- name: categories
sequence: string
- name: name
dtype: string
- name: description
dtype: string
- name: first_msg
dtype: string
- name: personality
dtype: string
- name: example_dialogues
sequence: string
- name: conversation
list:
- name: is_human
dtype: bool
- name: message
dtype: string
splits:
- name: train
num_bytes: 174673097
num_examples: 11841
download_size: 88204818
dataset_size: 174673097
license: agpl-3.0
task_categories:
- conversational
language:
- en
tags:
- not-for-all-audiences
- roleplay
- conversational
size_categories:
- 10K<n<100K
---
# Dataset Card for "PIPPA_TavernFormat"
Converted from the deduped version (pippa_deduped.jsonl) of [PygmalionAI/PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA?not-for-all-audiences=true).
Since the CAI format and the Tavern format do not align exactly, there may be some mismatches between fields, especially character description and personality. |
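The conversion from a CAI-style PIPPA record to this card's Tavern-style fields can be sketched roughly as below. The source field names (`bot_name`, `bot_greeting`, `bot_description`, `bot_definitions`) and the description/personality mapping are assumptions based on the PIPPA schema, not the exact script used for this dataset:

```python
def pippa_to_tavern(record: dict) -> dict:
    """Map a CAI-style PIPPA record onto Tavern-style character fields.

    NOTE: the mapping of `bot_description` -> description and
    `bot_definitions` -> personality is an assumption; the two formats
    do not line up one-to-one, which is why mismatches can occur.
    """
    return {
        "categories": record.get("categories", []),
        "name": record.get("bot_name", ""),
        "description": record.get("bot_description", ""),
        "personality": record.get("bot_definitions", ""),
        "first_msg": record.get("bot_greeting", ""),
        "conversation": [
            {"is_human": turn["is_human"], "message": turn["message"]}
            for turn in record.get("conversation", [])
        ],
    }
```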
JonasGeiping/the_pile_WordPiecex32768_97b8e776baafb99c3892e6572a9f51b3 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 22274051772
num_examples: 43166767
download_size: 12187746609
dataset_size: 22274051772
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license: other
multilinguality:
- monolingual
pretty_name: pretokenized,filtered,sorted subset of the Pile
size_categories:
- 10B<n<100B
source_datasets:
- the-pile
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: the-pile-cramming
---
# Dataset Card for the_pile_WordPiecex32768_97b8e776baafb99c3892e6572a9f51b3
This is a preprocessed, tokenized dataset for the cramming-project.
Use only with the tokenizer uploaded here.
This version is `97b8e776baafb99c3892e6572a9f51b3`, which corresponds to a specific dataset construction setup, described below.
The raw data source is the Pile, an 825 GiB diverse, open-source language modelling dataset that consists of 22 smaller, high-quality
datasets combined together.
## Dataset Description
- **Repository:** https://github.com/JonasGeiping/cramming
- **Paper:** https://arxiv.org/abs/2212.14034
- **Raw Data Source Paper:** [The Pile: An 800GB Dataset of Diverse Text for Language Modeling](https://arxiv.org/abs/2101.00027)
- **Raw Data Source Datasheet:** [Datasheet for the Pile](https://arxiv.org/abs/2201.07311)
### Languages
This dataset is in tokenized English (`EN`).
### Data Splits
This preprocessed subset contains only a train split.
## Dataset Creation
The configuration to create this dataset with the cramming project code (https://github.com/JonasGeiping/cramming) is
```
name: the_pile
defaults:
- sources:
- the_pile
# Preprocessing
normalizer:
force_lowercase: True
strip_accents: True
force_english_keyboard: True
whitespace_escape: False
tokenizer: WordPiece
vocab_size: 32768
# Dataset Formation
seq_length: 128
include_cls_token_in_corpus: False
include_sep_token_in_corpus: True
use_type_ids: False
max_entries_in_raw_dataset: 16e6
max_seq_in_tokenized_dataset: 85e6
# Data Cleaning:
named_entity_simplification: False
remove_whitespaces: False
remove_trash: True
trash_cutoff: 0.25
deduplicate_entries: False
deduplication_threshold: 75
# Data Order:
ordering: sentence-length-curriculum
```
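The `normalizer` settings above can be illustrated with a small sketch. This is a hypothetical approximation of what `force_lowercase`, `strip_accents`, and `force_english_keyboard` do, not the cramming project's actual implementation; in particular, the English-keyboard step is approximated here as a printable-ASCII filter.

```python
import unicodedata

def normalize(text: str) -> str:
    """Approximate the normalizer config (illustrative sketch only)."""
    # force_lowercase: True
    text = text.lower()
    # strip_accents: True -- decompose characters and drop combining marks
    text = "".join(
        ch for ch in unicodedata.normalize("NFD", text)
        if unicodedata.category(ch) != "Mn"
    )
    # force_english_keyboard: True -- approximated as an ASCII filter
    text = "".join(ch for ch in text if ch.isascii())
    return text

print(normalize("Héllo Wörld!"))
```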
## Considerations for Using the Data
Limitations and bias:
This training data was further filtered and sorted beyond the normal preprocessing.
These modifications were not tested for unintended consequences.
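One way to picture the `sentence-length-curriculum` ordering is sorting training sequences by how much real (non-padding) content they carry, shortest first. The sketch below is an illustrative reading of that setting, not the project's actual code; `PAD_ID` is an assumed padding token id.

```python
PAD_ID = 0  # assumed padding token id, for illustration only

sequences = [
    [5, 9, 2, PAD_ID, PAD_ID],            # 3 real tokens
    [7, 1, 4, 6, 8],                      # 5 real tokens
    [3, PAD_ID, PAD_ID, PAD_ID, PAD_ID],  # 1 real token
]

def real_length(seq):
    """Count non-padding tokens in a fixed-length sequence."""
    return sum(tok != PAD_ID for tok in seq)

# Curriculum order: shortest sentences first, longest last.
curriculum = sorted(sequences, key=real_length)
print([real_length(s) for s in curriculum])
```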
## Additional Information
### Dataset Curators
This dataset is a filtered, sorted, and preprocessed subset of the Pile made by Jonas Geiping. The original dataset was primarily curated by Leo Gao and Stella Biderman, with assistance from other authors of the Pile paper.
### Licensing Information
Please refer to the specific license depending on the subset you use at https://huggingface.co/datasets/EleutherAI/pile
### Citation Information
Filtered version for the cramming project:
```
@article{geiping_cramming_2022,
title = {Cramming: {{Training}} a {{Language Model}} on a {{Single GPU}} in {{One Day}}},
shorttitle = {Cramming},
author = {Geiping, Jonas and Goldstein, Tom},
year = {2022},
month = dec,
eprint = {2212.14034},
primaryclass = {cs},
publisher = {{arXiv}},
doi = {10.48550/arXiv.2212.14034},
url = {http://arxiv.org/abs/2212.14034},
urldate = {2023-01-10},
archiveprefix = {arxiv},
keywords = {Computer Science - Computation and Language,Computer Science - Machine Learning},
journal = {arxiv:2212.14034[cs]}
}
```
Original Data Curation:
```
@article{gao2020pile,
title={The {P}ile: An 800{GB} dataset of diverse text for language modeling},
author={Gao, Leo and Biderman, Stella and Black, Sid and Golding, Laurence and Hoppe, Travis and Foster, Charles and Phang, Jason and He, Horace and Thite, Anish and Nabeshima, Noa and others},
journal={arXiv preprint arXiv:2101.00027},
year={2020}
}
@article{biderman2022datasheet,
title={Datasheet for the pile},
author={Biderman, Stella and Bicheno, Kieran and Gao, Leo},
journal={arXiv preprint arXiv:2201.07311},
year={2022}
}
``` |
dipteshkanojia/llama-2-qe-2023-enmr-da-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 657819
num_examples: 1086
download_size: 281499
dataset_size: 657819
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- mr
- en
---
# Dataset Card for "llama-2-qe-2023-enmr-da-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vhtran/de-en-2023 | ---
license: cc-by-4.0
---
Purpose: Translate English to German |