| datasetId | card |
|---|---|
polinaeterna/int_float | ---
dataset_info:
features:
- name: int
dtype: int64
- name: float
dtype: float64
splits:
- name: train
num_bytes: 1600000000
num_examples: 100000000
download_size: 1169918838
dataset_size: 1600000000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/StanfordCars_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_8041 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 4049742
num_examples: 8041
- name: fewshot_3_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 15120375
num_examples: 8041
- name: fewshot_0__Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 4323240
num_examples: 8041
- name: fewshot_1__Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 8285003
num_examples: 8041
- name: fewshot_1__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 8232541
num_examples: 8041
- name: fewshot_3__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 16110353
num_examples: 8041
- name: fewshot_3__Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 16213718
num_examples: 8041
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 4321120
num_examples: 8041
download_size: 13641398
dataset_size: 76656092
configs:
- config_name: default
data_files:
- split: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
path: data/fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices-*
---
# Dataset Card for "StanfordCars_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_8041"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
graphs-datasets/reddit_threads | ---
license: gpl-3.0
task_categories:
- graph-ml
---
# Dataset Card for Reddit threads
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://snap.stanford.edu/data/reddit_threads.html)**
- **Paper:** (see citation)
### Dataset Summary
The `Reddit threads` dataset contains "discussion and non-discussion based threads from Reddit which we collected in May 2018. Nodes are Reddit users who participate in a discussion and links are replies between them" (quoted from the dataset homepage).
### Supported Tasks and Leaderboards
The related task is binary classification: predicting whether a thread is discussion-based or not.
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
import torch
from datasets import load_dataset
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader

dataset_hf = load_dataset("graphs-datasets/reddit_threads")
# This dataset only ships a "train" split (see Data Splits below)
dataset_pg_list = [
    Data(edge_index=torch.tensor(graph["edge_index"], dtype=torch.long),
         y=torch.tensor(graph["y"]),
         num_nodes=graph["num_nodes"])
    for graph in dataset_hf["train"]
]
dataset_pg = DataLoader(dataset_pg_list)
```
## Dataset Structure
### Dataset information
- 203,088 graphs
### Data Fields
Each row of a given file is a graph, with:
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `y` (list: #labels): the graph label(s) to predict (here, whether the thread is discussion-based)
- `num_nodes` (int): number of nodes of the graph
### Data Splits
This data is not split, and should be used with cross-validation. It comes from the PyGeometric version of the dataset.
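Since no predefined split is provided, one simple way to evaluate is k-fold cross-validation over graph indices. A minimal, dependency-free sketch (the helper name and fold count are illustrative):

```python
def k_fold_indices(n_graphs, k=10):
    """Yield (train_indices, test_indices) index lists for k folds."""
    indices = list(range(n_graphs))
    fold_size = n_graphs // k
    for fold in range(k):
        start, stop = fold * fold_size, (fold + 1) * fold_size
        # The held-out fold becomes the test set; the rest is training data.
        yield indices[:start] + indices[stop:], indices[start:stop]

# Example with a small number of graphs:
folds = list(k_fold_indices(20, k=5))
```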
## Additional Information
### Licensing Information
The dataset has been released under GPL-3.0 license.
### Citation Information
See also [github](https://github.com/benedekrozemberczki/karateclub).
```
@inproceedings{karateclub,
title = {{Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs}},
author = {Benedek Rozemberczki and Oliver Kiss and Rik Sarkar},
year = {2020},
pages = {3125--3132},
booktitle = {Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM '20)},
organization = {ACM},
}
``` |
jyang/webshop_inst_goal_pairs_il | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_cola_em_subj_pronoun | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 10089
num_examples: 127
- name: test
num_bytes: 11641
num_examples: 148
- name: train
num_bytes: 89492
num_examples: 1221
download_size: 55864
dataset_size: 111222
---
# Dataset Card for "MULTI_VALUE_cola_em_subj_pronoun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shreevigneshs/iwslt-2023-en-vi-train-split-v1 | ---
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
- name: vi_annotated
dtype: string
- name: styles
dtype: int64
splits:
- name: train
num_bytes: 293279.0
num_examples: 640
- name: val
num_bytes: 69940.0
num_examples: 160
- name: if_test
num_bytes: 33427.0
num_examples: 80
- name: f_test
num_bytes: 36513.0
num_examples: 80
download_size: 210801
dataset_size: 433159.0
---
# Dataset Card for "iwslt-2023-en-vi-train-split-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Joycean0301/test_dataset | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 192
num_examples: 10
download_size: 1271
dataset_size: 192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_serial_verb_go | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4444
num_examples: 26
- name: test
num_bytes: 6195
num_examples: 35
- name: train
num_bytes: 19908
num_examples: 109
download_size: 29977
dataset_size: 30547
---
# Dataset Card for "MULTI_VALUE_stsb_serial_verb_go"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_178 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1140992428.0
num_examples: 222329
download_size: 1169456445
dataset_size: 1140992428.0
---
# Dataset Card for "chunk_178"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mevol/protein_structure_NER_model_v3.1 | ---
license: mit
language:
- en
tags:
- biology
- protein structure
- token classification
configs:
- config_name: protein_structure_NER_model_v3.1
data_files:
- split: train
path: "annotation_IOB/train.tsv"
- split: dev
path: "annotation_IOB/dev.tsv"
- split: test
path: "annotation_IOB/test.tsv"
---
## Overview
This data was used to train model:
https://huggingface.co/mevol/BiomedNLP-PubMedBERT-ProteinStructure-NER-v3.1
There are 20 different entity types in this dataset:
"bond_interaction", "chemical", "complex_assembly", "evidence", "experimental_method", "gene",
"mutant", "oligomeric_state", "protein", "protein_state", "protein_type", "ptm", "residue_name",
"residue_name_number","residue_number", "residue_range", "site", "species", "structure_element",
"taxonomy_domain"
The data prepared as IOB-formatted input was used during training, development
and testing. Additional data formats, such as JSON and XML as well as CSV files, are
also available and are described below.
Annotation was carried out with the free annotation tool TeamTat (https://www.teamtat.org/) and
documents were downloaded as BioC XML before being converted to IOB, annotation-only JSON and CSV formats.
The number of annotations and sentences in each file is given below:
| document ID | number of annotations in BioC XML | number of annotations in IOB/JSON/CSV | number of sentences |
| --- | --- | --- | --- |
| PMC4850273 | 1129 | 1129 | 205 |
| PMC4784909 | 868 | 868 | 204 |
| PMC4850288 | 717 | 709 | 146 |
| PMC4887326 | 942 | 942 | 152 |
| PMC4833862 | 1044 | 1044 | 192 |
| PMC4832331 | 739 | 718 | 134 |
| PMC4852598 | 1239 | 1228 | 250 |
| PMC4786784 | 1573 | 1573 | 232 |
| PMC4848090 | 1000 | 998 | 192 |
| PMC4792962 | 1297 | 1297 | 256 |
| PMC4841544 | 1460 | 1459 | 274 |
| PMC4772114 | 824 | 824 | 165 |
| PMC4872110 | 1283 | 1283 | 250 |
| PMC4848761 | 888 | 884 | 252 |
| PMC4919469 | 1636 | 1624 | 336 |
| PMC4880283 | 783 | 783 | 166 |
| PMC4968113 | 1245 | 1245 | 292 |
| PMC4937829 | 633 | 633 | 181 |
| PMC4854314 | 498 | 488 | 139 |
| PMC4871749 | 411 | 411 | 79 |
| PMC4869123 | 922 | 922 | 195 |
| PMC4888278 | 580 | 580 | 102 |
| PMC4795551 | 1475 | 1475 | 297 |
| PMC4831588 | 1087 | 1070 | 224 |
| PMC4918766 | 1027 | 1027 | 210 |
| PMC4802042 | 1441 | 1441 | 264 |
| PMC4896748 | 2652 | 2638 | 480 |
| PMC4781976 | 115 | 113 | 24 |
| PMC4802085 | 983 | 983 | 193 |
| PMC4887163 | 856 | 856 | 196 |
| PMC4918759 | 803 | 803 | 175 |
| PMC4855620 | 563 | 563 | 122 |
| PMC4822050 | 1521 | 1521 | 249 |
| PMC4822561 | 367 | 366 | 84 |
| PMC4885502 | 577 | 577 | 97 |
| PMC4746701 | 1130 | 1130 | 245 |
| PMC4820378 | 733 | 733 | 170 |
| PMC4773095 | 1323 | 1323 | 252 |
| PMC4857006 | 1358 | 1358 | 249 |
| PMC4774019 | 532 | 530 | 117 |
| total | 40254 | 40149 | 8042 |
Documents and annotations are most easily viewed by opening the BioC XML
files in the free annotation tool TeamTat. More about the BioC
format can be found here: https://bioc.sourceforge.net/
## Raw BioC XML files
These are the raw, un-annotated XML files for the publications in the dataset in BioC format.
The files are found in the directory: "raw_BioC_XML".
There is one file for each document and they follow standard naming
"unique PubMedCentral ID"_raw.xml.
## Annotations in IOB format
The IOB-formatted files can be found in the directory: "annotation_IOB"
The four files are as follows:
* all.tsv --> all sentences and annotations used to create model
"mevol/BiomedNLP-PubMedBERT-ProteinStructure-NER-v3.1"; 8042 sentences
* train.tsv --> training subset of the data; 5629 sentences
* dev.tsv --> development subset of the data; 1206 sentences
* test.tsv --> testing subset of the data; 1207 sentences
The total number of annotations is: 40149
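A minimal sketch of parsing such a file, assuming a plain two-column token/tag layout separated by tabs; the actual column order and any blank-line sentence separators in `train.tsv` should be checked first, and the sample content below is illustrative only:

```python
import csv
import io

# Illustrative IOB content -- not taken from the real train.tsv.
sample_tsv = "Coenzyme\tB-chemical\nA\tI-chemical\nrecycling\tO\n"

tokens, tags = [], []
for token, tag in csv.reader(io.StringIO(sample_tsv), delimiter="\t"):
    tokens.append(token)
    tags.append(tag)
```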
## Annotations in BioC JSON
The BioC-formatted JSON files of the publications have been downloaded from the annotation
tool TeamTat. The files are found in the directory: "annotated_BioC_JSON"
There is one file for each document and they follow standard naming
"unique PubMedCentral ID"_ann.json
Each document JSON contains the following relevant keys:
* "sourceid" --> giving the numerical part of the unique PubMedCentral ID
* "text" --> containing the complete raw text of the publication as a string
* "denotations" --> containing a list of all the annotations for the text
Each annotation is a dictionary with the following keys:
* "span" --> gives the start and end of the annotation span defined by sub keys:
* "begin" --> character start position of annotation
* "end" --> character end position of annotation
* "obj" --> a string containing a number of terms that can be separated by ","; the order
of the terms gives the following: entity type, reference to ontology, annotator,
time stamp
* "id" --> unique annotation ID
Here is an example:
```json
[{"sourceid":"4784909",
"sourcedb":"",
"project":"",
"target":"",
"text":"",
"denotations":[{"span":{"begin":24,
"end":34},
"obj":"chemical,CHEBI:,melaniev@ebi.ac.uk,2023-03-21T15:19:42Z",
"id":"4500"},
{"span":{"begin":50,
"end":59},
"obj":"taxonomy_domain,DUMMY:,melaniev@ebi.ac.uk,2023-03-21T15:15:03Z",
"id":"1281"}]
}
]
```
## Annotations in BioC XML
The BioC-formatted XML files of the publications have been downloaded from the annotation
tool TeamTat. The files are found in the directory: "annotated_BioC_XML"
There is one file for each document and they follow standard naming
"unique PubMedCentral ID"_ann.xml
The key XML tags for visualising the annotations in TeamTat, as well as for extracting
them to create the training data, are "passage" and "offset". The "passage" tag encloses a
text passage or paragraph to which the annotations are linked. "Offset" gives the passage/
paragraph offset and makes it possible to determine the starting and ending character
positions of the annotations. The tag "text" encloses the raw text of the passage.
Each annotation in the XML file is tagged as below:
* "annotation id=" --> giving the unique ID of the annotation
* "infon key="type"" --> giving the entity type of the annotation
* "infon key="identifier"" --> giving a reference to an ontology for the annotation
* "infon key="annotator"" --> giving the annotator
* "infon key="updated_at"" --> providing a time stamp for annotation creation/update
* "location" --> start and end character positions for the annotated text span
* "offset" --> start character position as defined by offset value
* "length" --> length of the annotation span; the sum of "offset" and "length" gives
the end character position
Here is a basic example of what the BioC XML looks like. Additional tags for document
management are not given. Please refer to the documentation to find out more.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE collection SYSTEM "BioC.dtd">
<collection>
<source>PMC</source>
<date>20140719</date>
<key>pmc.key</key>
<document>
<id>4784909</id>
<passage>
<offset>0</offset>
<text>The Structural Basis of Coenzyme A Recycling in a Bacterial Organelle</text>
<annotation id="4500">
<infon key="type">chemical</infon>
<infon key="identifier">CHEBI:</infon>
<infon key="annotator">melaniev@ebi.ac.uk</infon>
<infon key="updated_at">2023-03-21T15:19:42Z</infon>
<location offset="24" length="10"/>
<text>Coenzyme A</text>
</annotation>
</passage>
</document>
</collection>
```
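The offset arithmetic can be checked directly against the example above: the annotated span runs from `offset` to `offset + length` in the passage text.

```python
# Title text and annotation values taken from the XML example above.
title = "The Structural Basis of Coenzyme A Recycling in a Bacterial Organelle"
offset, length = 24, 10
annotated_span = title[offset:offset + length]
```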
## Annotations in CSV
The annotations and the relevant sentences they have been found in have also been made
available as tab-separated CSV files, one for each publication in the dataset. The files can
be found in directory "annotation_CSV". Each file is named as "unique PubMedCentral ID".csv.
The column labels in the CSV files are as follows:
* "anno_start" --> character start position of the annotation
* "anno_end" --> character end position of the annotation
* "anno_text" --> text covered by the annotation
* "entity_type" --> entity type of the annotation
* "sentence" --> sentence text in which the annotation was found
* "section" --> publication section in which the annotation was found
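A short sketch of reading one of these files with the standard library; the files are tab-separated, the column names match the list above, and the sample row (plus the assumption of a header row) is illustrative:

```python
import csv
import io

# Illustrative content standing in for a file from annotation_CSV/.
sample = (
    "anno_start\tanno_end\tanno_text\tentity_type\tsentence\tsection\n"
    "24\t34\tCoenzyme A\tchemical\t"
    "The Structural Basis of Coenzyme A Recycling in a Bacterial Organelle\tTITLE\n"
)
rows = list(csv.DictReader(io.StringIO(sample), delimiter="\t"))
```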
## Annotations in JSON
A combined JSON file was created only containing the relevant sentences and associated
annotations for each publication in the dataset. The file can be found in directory
"annotation_JSON" under the name "annotations.json".
The following keys are used:
* "PMC4850273" --> unique PubMedCentral ID of the publication
* "annotations" --> list of dictionaries for the relevant, annotated sentences of the
document; each dictionary has the following sub keys
* "sid" --> unique sentence ID
* "sent" --> sentence text as string
* "section" --> publication section the sentence is in
* "ner" --> nested list of annotations; each sublist contains the following items:
start character position, end character position, annotation text,
entity type
Here is an example of a sentence and its annotations:
```json
{"PMC4850273": {"annotations":
[{"sid": 0,
"sent": "Molecular Dissection of Xyloglucan Recognition in a Prominent Human Gut Symbiont",
"section": "TITLE",
"ner": [
[24,34,"Xyloglucan","chemical"],
[62,67,"Human","species"]]
}]
}}
```
|
divi7007/try | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 21631104
num_examples: 528
download_size: 7333858
dataset_size: 21631104
---
# Dataset Card for "try"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/alpaca_tasksource3 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 135139002
num_examples: 253971
download_size: 76680453
dataset_size: 135139002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ceceri/LXY | ---
language:
- zh
size_categories:
- n<1K
--- |
mutual_friends | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- dialogue-modeling
paperswithcode_id: mutualfriends
pretty_name: MutualFriends
dataset_info:
features:
- name: uuid
dtype: string
- name: scenario_uuid
dtype: string
- name: scenario_alphas
sequence: float32
- name: scenario_attributes
sequence:
- name: unique
dtype: bool_
- name: value_type
dtype: string
- name: name
dtype: string
- name: scenario_kbs
sequence:
sequence:
sequence:
sequence: string
- name: agents
struct:
- name: '1'
dtype: string
- name: '0'
dtype: string
- name: outcome_reward
dtype: int32
- name: events
struct:
- name: actions
sequence: string
- name: start_times
sequence: float32
- name: data_messages
sequence: string
- name: data_selects
sequence:
- name: attributes
sequence: string
- name: values
sequence: string
- name: agents
sequence: int32
- name: times
sequence: float32
config_name: plain_text
splits:
- name: train
num_bytes: 26979472
num_examples: 8967
- name: test
num_bytes: 3327158
num_examples: 1107
- name: validation
num_bytes: 3267881
num_examples: 1083
download_size: 41274578
dataset_size: 33574511
---
# Dataset Card for MutualFriends
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [COCOA](https://stanfordnlp.github.io/cocoa/)
- **Repository:** [Github repository](https://github.com/stanfordnlp/cocoa)
- **Paper:** [Learning Symmetric Collaborative Dialogue Agents with Dynamic Knowledge Graph Embeddings (ACL 2017)](https://arxiv.org/abs/1704.07130)
- **Codalab**: [Codalab](https://worksheets.codalab.org/worksheets/0xc757f29f5c794e5eb7bfa8ca9c945573/)
### Dataset Summary
Our goal is to build systems that collaborate with people by exchanging information through natural language and reasoning over a structured knowledge base. In the MutualFriends task, two agents, A and B, each have a private knowledge base, which contains a list of friends with multiple attributes (e.g., name, school, major, etc.). The agents must chat with each other to find their unique mutual friend.
### Supported Tasks and Leaderboards
We consider two agents, each with a private knowledge base of items, who must communicate their knowledge to achieve a common goal. In the MutualFriends task, each agent has a list of friends with attributes such as school and major; the agents must chat with each other to find their unique mutual friend.
### Languages
The text in the dataset is in English. The associated BCP-47 code is `en`.
## Dataset Structure
### Data Instances
An example looks like this.
```
{
'uuid': 'C_423324a5fff045d78bef75a6f295a3f4',
'scenario_uuid': 'S_hvmRM4YNJd55ecT5',
'scenario_alphas': [0.30000001192092896, 1.0, 1.0],
'scenario_attributes': {
'name': ['School', 'Company', 'Location Preference'],
'unique': [False, False, False],
'value_type': ['school', 'company', 'loc_pref']
},
'scenario_kbs': [
[
[['School', 'Company', 'Location Preference'], ['Longwood College', 'Alton Steel', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Salisbury State University', 'Leonard Green & Partners', 'indoor']],
[['School', 'Company', 'Location Preference'], ['New Mexico Highlands University', 'Crazy Eddie', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Rhodes College', "Tully's Coffee", 'indoor']],
[['School', 'Company', 'Location Preference'], ['Sacred Heart University', 'AMR Corporation', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Salisbury State University', 'Molycorp', 'indoor']],
[['School', 'Company', 'Location Preference'], ['New Mexico Highlands University', 'The Hartford Financial Services Group', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Sacred Heart University', 'Molycorp', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Babson College', 'The Hartford Financial Services Group', 'indoor']]
],
[
[['School', 'Company', 'Location Preference'], ['National Technological University', 'Molycorp', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Fairmont State College', 'Leonard Green & Partners', 'outdoor']],
[['School', 'Company', 'Location Preference'], ['Johnson C. Smith University', 'Data Resources Inc.', 'outdoor']],
[['School', 'Company', 'Location Preference'], ['Salisbury State University', 'Molycorp', 'indoor']],
[['School', 'Company', 'Location Preference'], ['Fairmont State College', 'Molycorp', 'outdoor']],
[['School', 'Company', 'Location Preference'], ['University of South Carolina - Aiken', 'Molycorp', 'indoor']],
[['School', 'Company', 'Location Preference'], ['University of South Carolina - Aiken', 'STX', 'outdoor']],
[['School', 'Company', 'Location Preference'], ['National Technological University', 'STX', 'outdoor']],
[['School', 'Company', 'Location Preference'], ['Johnson C. Smith University', 'Rockstar Games', 'indoor']]
]
],
'agents': {
'0': 'human',
'1': 'human'
},
'outcome_reward': 1,
'events': {
'actions': ['message', 'message', 'message', 'message', 'select', 'select'],
'agents': [1, 1, 0, 0, 1, 0],
'data_messages': ['Hello', 'Do you know anyone who works at Molycorp?', 'Hi. All of my friends like the indoors.', 'Ihave two friends that work at Molycorp. They went to Salisbury and Sacred Heart.', '', ''],
'data_selects': {
'attributes': [
[], [], [], [], ['School', 'Company', 'Location Preference'], ['School', 'Company', 'Location Preference']
],
'values': [
[], [], [], [], ['Salisbury State University', 'Molycorp', 'indoor'], ['Salisbury State University', 'Molycorp', 'indoor']
]
},
'start_times': [-1.0, -1.0, -1.0, -1.0, -1.0, -1.0],
'times': [1480737280.0, 1480737280.0, 1480737280.0, 1480737280.0, 1480737280.0, 1480737280.0]
},
}
```
### Data Fields
- `uuid`: example id.
- `scenario_uuid`: scenario id.
- `scenario_alphas`: scenario alphas.
- `scenario_attributes`: all the attributes considered in the scenario. The dictionaries are linearized: to reconstruct the dictionary for the i-th attribute, take the i-th elements of `unique`, `value_type` and `name`.
- `unique`: bool.
- `value_type`: code/type of the attribute.
- `name`: name of the attribute.
- `scenario_kbs`: descriptions of the persons present in the two users' databases. List of two (one for each user in the dialogue). `scenario_kbs[i]` is a list of persons. Each person is represented as two lists (one for attribute names and the other for attribute values). The j-th element of attribute names corresponds to the j-th element of attribute values (linearized dictionary).
- `agents`: the two users engaged in the dialogue.
- `outcome_reward`: reward of the present dialogue.
- `events`: dictionary describing the dialogue. The j-th element of each sub-element of the dictionary describes the turn along the axis of the sub-element.
- `actions`: type of turn (either `message` or `select`).
- `agents`: index of the speaking agent for each turn (0 or 1).
- `data_messages`: the string exchanged if `action==message`. Otherwise, empty string.
- `data_selects`: selection of the user if `action==select`. Otherwise, empty selection/dictionary.
- `start_times`: always -1 in these data.
- `times`: sending time.
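The linearized fields above can be turned back into per-attribute and per-turn records with a few `zip` calls; the values here are shortened from the example instance:

```python
# Rebuild one dict per attribute from the parallel lists.
scenario_attributes = {
    "name": ["School", "Company", "Location Preference"],
    "unique": [False, False, False],
    "value_type": ["school", "company", "loc_pref"],
}
attributes = [
    {"name": n, "unique": u, "value_type": v}
    for n, u, v in zip(scenario_attributes["name"],
                       scenario_attributes["unique"],
                       scenario_attributes["value_type"])
]

# Rebuild (agent, action, message) tuples, one per dialogue turn.
events = {"actions": ["message", "select"],
          "agents": [1, 0],
          "data_messages": ["Hello", ""]}
turns = list(zip(events["agents"], events["actions"], events["data_messages"]))
```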
### Data Splits
There are 8967 dialogues for training, 1083 for validation and 1107 for testing.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{he-etal-2017-learning,
title = "Learning Symmetric Collaborative Dialogue Agents with Dynamic Knowledge Graph Embeddings",
author = "He, He and
Balakrishnan, Anusha and
Eric, Mihail and
Liang, Percy",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P17-1162",
doi = "10.18653/v1/P17-1162",
pages = "1766--1776",
abstract = "We study a \textit{symmetric collaborative dialogue} setting in which two agents, each with private knowledge, must strategically communicate to achieve a common goal. The open-ended dialogue state in this setting poses new challenges for existing dialogue systems. We collected a dataset of 11K human-human dialogues, which exhibits interesting lexical, semantic, and strategic elements. To model both structured knowledge and unstructured language, we propose a neural model with dynamic knowledge graph embeddings that evolve as the dialogue progresses. Automatic and human evaluations show that our model is both more effective at achieving the goal and more human-like than baseline neural and rule-based models.",
}
```
### Contributions
Thanks to [@VictorSanh](https://github.com/VictorSanh) for adding this dataset. |
imvladikon/opus_en_he | ---
dataset_info:
features:
- name: sentence_en
dtype: string
- name: sentence_he
dtype: string
splits:
- name: train
num_bytes: 91159631
num_examples: 1000000
- name: validation
num_bytes: 209438
num_examples: 2000
- name: test
num_bytes: 208467
num_examples: 2000
download_size: 61132866
dataset_size: 91577536
---
# Dataset Card for "opus_en_he"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Sample
```json
{"sentence_en": "Hey, guys.", "sentence_he": "היי, חבר'ה."}
``` |
peterwz/wiki-length | ---
license: apache-2.0
dataset_info:
features:
- name: file_name
dtype: string
- name: original
dtype: string
- name: summary
dtype: string
- name: compression_ratio
dtype: float64
splits:
- name: train
num_bytes: 2797346
num_examples: 119
download_size: 1582308
dataset_size: 2797346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/okazaki_yasuha_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of okazaki_yasuha/岡崎泰葉 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of okazaki_yasuha/岡崎泰葉 (THE iDOLM@STER: Cinderella Girls), containing 70 images and their tags.
The core tags of this character are `short_hair, blue_hair, black_hair, bangs, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 70 | 65.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okazaki_yasuha_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 70 | 45.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okazaki_yasuha_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 152 | 87.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okazaki_yasuha_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 70 | 60.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okazaki_yasuha_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 152 | 114.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okazaki_yasuha_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/okazaki_yasuha_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, purple_eyes, smile, school_uniform, simple_background, open_mouth, white_background, glasses, medium_breasts, navel, necktie, skirt |
| 1 | 12 |  |  |  |  |  | 1girl, smile, open_mouth, solo, black_eyes, dress, gloves, card_(medium), character_name, flower, frills, gem_(symbol), looking_at_viewer, microphone, choker, hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | purple_eyes | smile | school_uniform | simple_background | open_mouth | white_background | glasses | medium_breasts | navel | necktie | skirt | black_eyes | dress | gloves | card_(medium) | character_name | flower | frills | gem_(symbol) | microphone | choker | hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------------|:--------|:-----------------|:--------------------|:-------------|:-------------------|:----------|:-----------------|:--------|:----------|:--------|:-------------|:--------|:---------|:----------------|:-----------------|:---------|:---------|:---------------|:-------------|:---------|:----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | | X | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
freshpearYoon/train_free_30 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604506536
num_examples: 10000
download_size: 1234977213
dataset_size: 9604506536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FaalSa/data2 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 17309
num_examples: 1
- name: validation
num_bytes: 17789
num_examples: 1
- name: test
num_bytes: 18269
num_examples: 1
download_size: 8287
dataset_size: 53367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
joey234/mmlu-high_school_psychology-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 9825
num_examples: 5
- name: test
num_bytes: 6256568
num_examples: 545
download_size: 482916
dataset_size: 6266393
---
# Dataset Card for "mmlu-high_school_psychology-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_marker_before_sent_train_400_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 2782863
num_examples: 2428
- name: validation
num_bytes: 215266
num_examples: 200
download_size: 0
dataset_size: 2998129
---
# Dataset Card for "find_marker_before_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
azrai99/data-scientist-jobstreet-dataset | ---
license: apache-2.0
size_categories:
- n<1K
---
# Jobstreet Webscraping

The data was scraped from the JobStreet Malaysia site with the search keyword *data scientist*, using BeautifulSoup4. |
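The card does not include the scraping script itself. The sketch below shows the general shape of such a scraper using only the standard library's `html.parser` (the actual dataset was built with BeautifulSoup4, and the `job-title` tag/class markup here is a hypothetical placeholder, not JobStreet's real page structure):

```python
from html.parser import HTMLParser


class JobTitleParser(HTMLParser):
    """Collect the text of every <h3 class="job-title"> element (hypothetical markup)."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == 'h3' and ('class', 'job-title') in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == 'h3':
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.titles.append(data.strip())


# In the real scraper the HTML would come from the search-results page
# (e.g. fetched with urllib.request); a fixed snippet is used here.
html = '<div><h3 class="job-title">Data Scientist</h3><h3 class="job-title">ML Engineer</h3></div>'
parser = JobTitleParser()
parser.feed(html)
print(parser.titles)  # ['Data Scientist', 'ML Engineer']
```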
open-llm-leaderboard/details_venkycs__ZySec-1B | ---
pretty_name: Evaluation run of venkycs/ZySec-1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [venkycs/ZySec-1B](https://huggingface.co/venkycs/ZySec-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_venkycs__ZySec-1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T19:58:01.944130](https://huggingface.co/datasets/open-llm-leaderboard/details_venkycs__ZySec-1B/blob/main/results_2024-01-27T19-58-01.944130.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2578838995098124,\n\
\ \"acc_stderr\": 0.030721943510218043,\n \"acc_norm\": 0.25894411476824014,\n\
\ \"acc_norm_stderr\": 0.0314742515286692,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3565914064488495,\n\
\ \"mc2_stderr\": 0.014002389029353163\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3583617747440273,\n \"acc_stderr\": 0.014012883334859866,\n\
\ \"acc_norm\": 0.3839590443686007,\n \"acc_norm_stderr\": 0.014212444980651889\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4649472216689902,\n\
\ \"acc_stderr\": 0.004977504446609,\n \"acc_norm\": 0.6153156741684923,\n\
\ \"acc_norm_stderr\": 0.004855262903270809\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.03302789859901717,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.03302789859901717\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.031862098516411426,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.031862098516411426\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628813,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628813\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733555,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733555\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971534,\n\
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.028359620870533946,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.028359620870533946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.018224078117299085,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.018224078117299085\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906942,\n\
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906942\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749482,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749482\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.01609530296987856,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.01609530296987856\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331161,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331161\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.21631205673758866,\n \"acc_stderr\": 0.024561720560562786,\n \
\ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.024561720560562786\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23598435462842243,\n\
\ \"acc_stderr\": 0.010844802669662689,\n \"acc_norm\": 0.23598435462842243,\n\
\ \"acc_norm_stderr\": 0.010844802669662689\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.02533684856333236,\n\
\ \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.02533684856333236\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3565914064488495,\n\
\ \"mc2_stderr\": 0.014002389029353163\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6132596685082873,\n \"acc_stderr\": 0.013687214761883039\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890076\n }\n}\n```"
repo_url: https://huggingface.co/venkycs/ZySec-1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|arc:challenge|25_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|gsm8k|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hellaswag|10_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T19-58-01.944130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T19-58-01.944130.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- '**/details_harness|winogrande|5_2024-01-27T19-58-01.944130.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T19-58-01.944130.parquet'
- config_name: results
data_files:
- split: 2024_01_27T19_58_01.944130
path:
- results_2024-01-27T19-58-01.944130.parquet
- split: latest
path:
- results_2024-01-27T19-58-01.944130.parquet
---
# Dataset Card for Evaluation run of venkycs/ZySec-1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [venkycs/ZySec-1B](https://huggingface.co/venkycs/ZySec-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_venkycs__ZySec-1B",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-27T19:58:01.944130](https://huggingface.co/datasets/open-llm-leaderboard/details_venkycs__ZySec-1B/blob/main/results_2024-01-27T19-58-01.944130.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of the corresponding per-task configuration):
```python
{
"all": {
"acc": 0.2578838995098124,
"acc_stderr": 0.030721943510218043,
"acc_norm": 0.25894411476824014,
"acc_norm_stderr": 0.0314742515286692,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3565914064488495,
"mc2_stderr": 0.014002389029353163
},
"harness|arc:challenge|25": {
"acc": 0.3583617747440273,
"acc_stderr": 0.014012883334859866,
"acc_norm": 0.3839590443686007,
"acc_norm_stderr": 0.014212444980651889
},
"harness|hellaswag|10": {
"acc": 0.4649472216689902,
"acc_stderr": 0.004977504446609,
"acc_norm": 0.6153156741684923,
"acc_norm_stderr": 0.004855262903270809
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.03302789859901717,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.03302789859901717
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.031862098516411426,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.031862098516411426
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416542,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416542
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628813,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628813
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733555,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733555
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.022139081103971534,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.022139081103971534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.028359620870533946,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.028359620870533946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.018224078117299085,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.018224078117299085
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906942,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906942
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749482,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749482
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.01609530296987856,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.01609530296987856
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331161,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331161
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816657,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816657
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.024561720560562786,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.024561720560562786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23598435462842243,
"acc_stderr": 0.010844802669662689,
"acc_norm": 0.23598435462842243,
"acc_norm_stderr": 0.010844802669662689
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22426470588235295,
"acc_stderr": 0.02533684856333236,
"acc_norm": 0.22426470588235295,
"acc_norm_stderr": 0.02533684856333236
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3565914064488495,
"mc2_stderr": 0.014002389029353163
},
"harness|winogrande|5": {
"acc": 0.6132596685082873,
"acc_stderr": 0.013687214761883039
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890076
}
}
```
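As an illustration (a minimal sketch, not part of the evaluation pipeline), once the results JSON above is loaded you can aggregate per-task metrics with plain Python. The `results` dict below is a small hand-copied excerpt of the file; in practice you would load the full `results_*.json` with `json.load`:

```python
# Minimal sketch: average the "acc" metric over a few tasks taken from the
# results JSON above. In practice, load the full file instead, e.g.:
#   results = json.load(open("results_2024-01-27T19-58-01.944130.json"))["results"]
results = {
    "harness|winogrande|5": {"acc": 0.6132596685082873},
    "harness|gsm8k|5": {"acc": 0.01592115238817286},
    "harness|hendrycksTest-virology|5": {"acc": 0.3253012048192771},
}

# Collect every task that reports an "acc" value and take the mean.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.3182
```

The same pattern works for `acc_norm`, `mc1`, or `mc2`: filter on the metric key, since not every harness task reports every metric.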
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713095484 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2320262
num_examples: 6187
download_size: 1157019
dataset_size: 2320262
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/tanikaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tanikaze/谷風 (Kantai Collection)
This is the dataset of tanikaze/谷風 (Kantai Collection), containing 261 images and their tags.
The core tags of this character are `short_hair, hairband, black_hair, brown_eyes, brown_hair, white_hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
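Pruning works by dropping the listed core tags from each image's tag list; a minimal sketch (the `prune_core_tags` helper is hypothetical and not part of the crawling pipeline):

```python
# Core tags of this character, as listed above. Since essentially every image of
# the character carries them, they add no training signal and are removed.
CORE_TAGS = {"short_hair", "hairband", "black_hair", "brown_eyes", "brown_hair", "white_hairband"}

def prune_core_tags(tags):
    """Return the tag list with the character's core tags removed, order preserved."""
    return [t for t in tags if t not in CORE_TAGS]

pruned = prune_core_tags(["1girl", "short_hair", "serafuku", "brown_eyes", "solo"])
```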
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 261 | 171.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanikaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 261 | 129.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanikaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 508 | 242.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanikaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 261 | 163.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanikaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 508 | 291.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanikaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
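In the IMG+TXT packages, each image ships with a same-stem `.txt` tag file. After extracting one of the zips, the pairs can be matched with a short sketch (assuming a flat directory layout; the `pair_img_txt` helper is hypothetical):

```python
import os

def pair_img_txt(dataset_dir):
    """Map each image stem to its (image_path, tags_path) pair in dataset_dir."""
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in {".png", ".jpg", ".jpeg", ".webp"}:
            txt = os.path.join(dataset_dir, stem + ".txt")
            # Keep only images that actually have a matching tag file.
            if os.path.exists(txt):
                pairs[stem] = (os.path.join(dataset_dir, name), txt)
    return pairs
```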
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tanikaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, pleated_skirt, serafuku, short_sleeves, solo, white_gloves, white_thighhighs, yellow_neckerchief, grey_skirt, smile, blue_sailor_collar, full_body, looking_at_viewer, white_background, simple_background, standing, open_mouth, white_shirt |
| 1 | 21 |  |  |  |  |  | 1girl, pleated_skirt, serafuku, solo, neckerchief, white_gloves, white_thighhighs, looking_at_viewer, open_mouth, short_sleeves, white_background, :d, machinery, sitting |
| 2 | 12 |  |  |  |  |  | 1girl, pleated_skirt, serafuku, short_sleeves, solo, white_thighhighs, green_panties, simple_background, white_shirt, blouse, blue_sailor_collar, grey_skirt, white_background, yellow_neckerchief, blush, sitting, white_gloves, looking_at_viewer, small_breasts, spread_legs |
| 3 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, serafuku, solo, white_thighhighs, torn_skirt, torn_thighhighs, white_background, white_gloves, sitting, small_breasts, torn_shirt, open_mouth, simple_background, tears, tongue_out, white_panties |
| 4 | 5 |  |  |  |  |  | 1girl, serafuku, skirt, solo, tears, torn_thighhighs, white_thighhighs, blush, looking_at_viewer, open_mouth, smile, navel, sitting, white_gloves |
| 5 | 5 |  |  |  |  |  | 1girl, blush, hetero, nipples, 1boy, closed_eyes, nude, open_mouth, thighhighs, bar_censor, cum_in_pussy, navel, small_breasts, solo_focus, vaginal, cowgirl_position, girl_on_top, hair_between_eyes, heart, penis, sex_from_behind, simple_background, sweat, tears, white_background, white_gloves |
| 6 | 5 |  |  |  |  |  | 1girl, playboy_bunny, rabbit_ears, solo, strapless_leotard, wrist_cuffs, detached_collar, fake_animal_ears, simple_background, white_background, black_leotard, black_pantyhose, looking_at_viewer, rabbit_tail, small_breasts, alternate_costume, fishnet_pantyhose, full_body, hand_on_hip, high_heels, smile, white_leotard, yellow_bowtie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | pleated_skirt | serafuku | short_sleeves | solo | white_gloves | white_thighhighs | yellow_neckerchief | grey_skirt | smile | blue_sailor_collar | full_body | looking_at_viewer | white_background | simple_background | standing | open_mouth | white_shirt | neckerchief | :d | machinery | sitting | green_panties | blouse | blush | small_breasts | spread_legs | navel | torn_skirt | torn_thighhighs | torn_shirt | tears | tongue_out | white_panties | skirt | hetero | nipples | 1boy | closed_eyes | nude | thighhighs | bar_censor | cum_in_pussy | solo_focus | vaginal | cowgirl_position | girl_on_top | hair_between_eyes | heart | penis | sex_from_behind | sweat | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | detached_collar | fake_animal_ears | black_leotard | black_pantyhose | rabbit_tail | alternate_costume | fishnet_pantyhose | hand_on_hip | high_heels | white_leotard | yellow_bowtie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-----------|:----------------|:-------|:---------------|:-------------------|:---------------------|:-------------|:--------|:---------------------|:------------|:--------------------|:-------------------|:--------------------|:-----------|:-------------|:--------------|:--------------|:-----|:------------|:----------|:----------------|:---------|:--------|:----------------|:--------------|:--------|:-------------|:------------------|:-------------|:--------|:-------------|:----------------|:--------|:---------|:----------|:-------|:--------------|:-------|:-------------|:-------------|:---------------|:-------------|:----------|:-------------------|:--------------|:--------------------|:--------|:--------|:------------------|:--------|:----------------|:--------------|:--------------------|:--------------|:------------------|:-------------------|:----------------|:------------------|:--------------|:--------------------|:--------------------|:--------------|:-------------|:----------------|:----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | X | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | X | X | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | X | X | X | | | | | | X | X | X | | X | | | | | X | | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | X | X | X | | | X | | | X | | | | X | | | | | X | | | X | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | X | | | | | | | | X | X | | X | | | | | | | | X | X | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | X | | | | | X | | X | X | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf | ---
pretty_name: Evaluation run of lizpreciatior/lzlv_70b_fp16_hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T11:08:18.401041](https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf/blob/main/results_2023-10-24T11-08-18.401041.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.040058724832214766,\n\
\ \"em_stderr\": 0.002008216561907643,\n \"f1\": 0.10676174496644267,\n\
\ \"f1_stderr\": 0.002328625422990624,\n \"acc\": 0.5717896950225979,\n\
\ \"acc_stderr\": 0.011591305235224383\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.040058724832214766,\n \"em_stderr\": 0.002008216561907643,\n\
\ \"f1\": 0.10676174496644267,\n \"f1_stderr\": 0.002328625422990624\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30932524639878695,\n \
\ \"acc_stderr\": 0.012731710925078124\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370642\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T11_08_18.401041
path:
- '**/details_harness|drop|3_2023-10-24T11-08-18.401041.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T11-08-18.401041.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T11_08_18.401041
path:
- '**/details_harness|gsm8k|5_2023-10-24T11-08-18.401041.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T11-08-18.401041.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T11_08_18.401041
path:
- '**/details_harness|winogrande|5_2023-10-24T11-08-18.401041.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T11-08-18.401041.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- results_2023-10-10T17-25-31.421123.parquet
- split: 2023_10_24T11_08_18.401041
path:
- results_2023-10-24T11-08-18.401041.parquet
- split: latest
path:
- results_2023-10-24T11-08-18.401041.parquet
---
# Dataset Card for Evaluation run of lizpreciatior/lzlv_70b_fp16_hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf",
"harness_winogrande_5",
	split="latest")
```
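To load one specific run rather than the latest one, the run timestamp has to be converted into the split-name format used in the configuration above. The mapping can be sketched as follows (the pattern is inferred from the split names in this card's YAML, and `timestamp_to_split` is a hypothetical helper, not part of any official API):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name by replacing '-' and ':'
    with '_' (pattern inferred from the split names in the YAML above)."""
    return ts.replace("-", "_").replace(":", "_")

# The run 2023-10-24T11:08:18.401041 appears as this split name:
print(timestamp_to_split("2023-10-24T11:08:18.401041"))
# 2023_10_24T11_08_18.401041
```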
## Latest results
These are the [latest results from run 2023-10-24T11:08:18.401041](https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf/blob/main/results_2023-10-24T11-08-18.401041.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.040058724832214766,
"em_stderr": 0.002008216561907643,
"f1": 0.10676174496644267,
"f1_stderr": 0.002328625422990624,
"acc": 0.5717896950225979,
"acc_stderr": 0.011591305235224383
},
"harness|drop|3": {
"em": 0.040058724832214766,
"em_stderr": 0.002008216561907643,
"f1": 0.10676174496644267,
"f1_stderr": 0.002328625422990624
},
"harness|gsm8k|5": {
"acc": 0.30932524639878695,
"acc_stderr": 0.012731710925078124
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370642
}
}
```
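As a quick sanity check, the headline metrics can be pulled out of the downloaded results JSON with plain Python. The sketch below inlines an excerpt of the dict shown above rather than reading the file; `headline_accuracy` is a hypothetical helper written for illustration:

```python
import json

# Excerpt of the aggregated results shown above; in practice this dict
# would come from json.load() on the downloaded results_*.json file.
results = {
    "all": {
        "em": 0.040058724832214766,
        "f1": 0.10676174496644267,
        "acc": 0.5717896950225979,
    },
    "harness|winogrande|5": {"acc": 0.8342541436464088},
    "harness|gsm8k|5": {"acc": 0.30932524639878695},
}

def headline_accuracy(res: dict) -> dict:
    """Return per-task accuracy for every task that reports 'acc',
    skipping the 'all' aggregate."""
    return {
        task: metrics["acc"]
        for task, metrics in res.items()
        if task != "all" and "acc" in metrics
    }

per_task = headline_accuracy(results)
print(json.dumps(per_task, indent=2))
```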
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gaganpathre/amgerindaf | ---
license: mit
---
|
communityai/gretelai___synthetic_text_to_sql-20k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 16860322.2
num_examples: 20000
download_size: 6011892
dataset_size: 16860322.2
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mahdibaghbanzadeh/GUE_prom_prom_300_all | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 14775072
num_examples: 47356
- name: val
num_bytes: 1847040
num_examples: 5920
- name: test
num_bytes: 1847040
num_examples: 5920
download_size: 8664009
dataset_size: 18469152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
Einstellung/demo-salaries | ---
language:
- en
- es
license: apache-2.0
tags:
- tabular
- "2023"
- Jobs
- Computer Science
language_creators:
- crowdsourced
pretty_name: pretty_name
size_categories:
- n<1k
source_datasets:
- aijobs.net
task_categories:
- tabular-regression
- tabular-classification
task_ids:
- tabular-single-column-regression
- tabular-multi-label-classification
# configs: # Optional for datasets with multiple configurations like glue.
# - sst2 # Example for glue: sst2
# - cola # Example for glue: cola
dataset_info:
features:
- name: work_year
dtype: int64
- name: experience_level
dtype: string
- name: employment_type
dtype: string
- name: job_title
dtype: string
- name: salary
dtype: int64
- name: salary_currency
dtype: string
- name: salary_in_usd
dtype: int64
- name: employee_residence
dtype: string
- name: remote_ratio
dtype: int64
- name: company_location
dtype: string
- name: company_size
dtype: string
config_name: sst2
splits:
- name: train
num_bytes: 79317110
num_examples: 87599
download_size: 35142551
dataset_size: 89789763
---
## Dataset Description
- **Homepage:** [Add homepage URL here if available (unless it's a GitHub repository)]()
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [If the dataset was introduced by a paper or there was a paper written describing the dataset, add URL here (landing page for Arxiv paper preferred)]()
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** [If known, name and email of at least one person the reader can contact for questions about the dataset.]()
### Dataset Summary
Briefly summarize the dataset, its intended use and the supported tasks. Give an overview of how and why the dataset was created. The summary should explicitly mention the languages present in the dataset (possibly in broad terms, e.g. *translations between several pairs of European languages*), and describe the domain, topic, or genre covered.
### Supported Tasks and Leaderboards
For each of the tasks tagged for this dataset, give a brief description of the tag, metrics, and suggested models (with a link to their HuggingFace implementation if available). Give a similar description of tasks that were not covered by the structured tag set (replace the `task-category-tag` with an appropriate `other:other-task-name`).
- `task-category-tag`: The dataset can be used to train a model for [TASK NAME], which consists in [TASK DESCRIPTION]. Success on this task is typically measured by achieving a *high/low* [metric name](https://huggingface.co/metrics/metric_name). The ([model name](https://huggingface.co/model_name) or [model class](https://huggingface.co/transformers/model_doc/model_class.html)) model currently achieves the following score. *[IF A LEADERBOARD IS AVAILABLE]:* This task has an active leaderboard which can be found at [leaderboard url]() and ranks models based on [metric name](https://huggingface.co/metrics/metric_name) while also reporting [other metric name](https://huggingface.co/metrics/other_metric_name).
### Languages
Provide a brief overview of the languages represented in the dataset. Describe relevant details about specifics of the language such as whether it is social media text, African American English,...
When relevant, please provide [BCP-47 codes](https://tools.ietf.org/html/bcp47), which consist of a [primary language subtag](https://tools.ietf.org/html/bcp47#section-2.2.1), with a [script subtag](https://tools.ietf.org/html/bcp47#section-2.2.3) and/or [region subtag](https://tools.ietf.org/html/bcp47#section-2.2.4) if available.
## Dataset Structure
### Data Instances
Provide a JSON-formatted example and brief description of a typical instance in the dataset. If available, provide a link to further examples.
```
{
'example_field': ...,
...
}
```
Provide any additional information that is not covered in the other sections about the data here. In particular describe any relationships between data points and if these relationships are made explicit.
### Data Fields
List and describe the fields present in the dataset. Mention their data type, and whether they are used as input or output in any of the tasks the dataset currently supports. If the data has span indices, describe their attributes, such as whether they are at the character level or word level, whether they are contiguous or not, etc. If the dataset contains example IDs, state whether they have an inherent meaning, such as a mapping to other datasets or pointing to relationships between data points.
- `example_field`: description of `example_field`
Note that the descriptions can be initialized with the **Show Markdown Data Fields** output of the [Datasets Tagging app](https://huggingface.co/spaces/huggingface/datasets-tagging), you will then only need to refine the generated descriptions.
### Data Splits
Describe and name the splits in the dataset if there are more than one.
Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g. if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here.
Provide the sizes of each split. As appropriate, provide any descriptive statistics for the features, such as average length. For example:
| | train | validation | test |
|-------------------------|------:|-----------:|-----:|
| Input Sentences | | | |
| Average Sentence Length | | | |
## Dataset Creation
### Curation Rationale
What need motivated the creation of this dataset? What are some of the reasons underlying the major choices involved in putting it together?
### Source Data
This section describes the source data (e.g. news text and headlines, social media posts, translated sentences,...)
#### Initial Data Collection and Normalization
Describe the data collection process. Describe any criteria for data selection or filtering. List any key words or search terms used. If possible, include runtime information for the collection process.
If data was collected from other pre-existing datasets, link to source here and to their [Hugging Face version](https://huggingface.co/datasets/dataset_name).
If the data was modified or normalized after being collected (e.g. if the data is word-tokenized), describe the process and the tools used.
#### Who are the source language producers?
State whether the data was produced by humans or machine generated. Describe the people or systems who originally created the data.
If available, include self-reported demographic or identity information for the source data creators, but avoid inferring this information. Instead state that this information is unknown. See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as variables, particularly gender.
Describe the conditions under which the data was created (for example, if the producers were crowdworkers, state what platform was used, or if the data was found, what website the data was found on). If compensation was provided, include that information here.
Describe other people represented or mentioned in the data. Where possible, link to references for the information.
### Annotations
If the dataset contains annotations which are not part of the initial data collection, describe them in the following paragraphs.
#### Annotation process
If applicable, describe the annotation process and any tools used, or state otherwise. Describe the amount of data annotated, if not all. Describe or reference annotation guidelines provided to the annotators. If available, provide inter-annotator statistics. Describe any annotation validation processes.
#### Who are the annotators?
If annotations were collected for the source data (such as class labels or syntactic parses), state whether the annotations were produced by humans or machine generated.
Describe the people or systems who originally created the annotations and their selection criteria if applicable.
If available, include self-reported demographic or identity information for the annotators, but avoid inferring this information. Instead state that this information is unknown. See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as variables, particularly gender.
Describe the conditions under which the data was annotated (for example, if the annotators were crowdworkers, state what platform was used, or if the data was found, what website the data was found on). If compensation was provided, include that information here.
### Personal and Sensitive Information
State whether the dataset uses identity categories and, if so, how the information is used. Describe where this information comes from (i.e. self-reporting, collecting from profiles, inferring, etc.). See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as variables, particularly gender. State whether the data is linked to individuals and whether those individuals can be identified in the dataset, either directly or indirectly (i.e., in combination with other data).
State whether the dataset contains other data that might be considered sensitive (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history).
If efforts were made to anonymize the data, describe the anonymization process.
## Considerations for Using the Data
### Social Impact of Dataset
Please discuss some of the ways you believe the use of this dataset will impact society.
The statement should include both positive outlooks, such as outlining how technologies developed through its use may improve people's lives, and discuss the accompanying risks. These risks may range from making important decisions more opaque to people who are affected by the technology, to reinforcing existing harmful biases (whose specifics should be discussed in the next section), among other considerations.
Also describe in this section if the proposed dataset contains a low-resource or under-represented language. If this is the case or if this task has any impact on underserved communities, please elaborate here.
### Discussion of Biases
Provide descriptions of specific biases that are likely to be reflected in the data, and state whether any steps were taken to reduce their impact.
For Wikipedia text, see for example [Dinan et al 2020 on biases in Wikipedia (esp. Table 1)](https://arxiv.org/abs/2005.00614), or [Blodgett et al 2020](https://www.aclweb.org/anthology/2020.acl-main.485/) for a more general discussion of the topic.
If analyses have been run quantifying these biases, please add brief summaries and links to the studies here.
### Other Known Limitations
If studies of the datasets have outlined other limitations of the dataset, such as annotation artifacts, please outline and cite them here.
## Additional Information
### Dataset Curators
List the people involved in collecting the dataset and their affiliation(s). If funding information is known, include it here.
### Licensing Information
Provide the license and link to the license webpage if available.
### Citation Information
Provide the [BibTex](http://www.bibtex.org/)-formatted reference for the dataset. For example:
```
@article{article_id,
author = {Author List},
title = {Dataset Paper Title},
journal = {Publication Venue},
year = {2525}
}
```
If the dataset has a [DOI](https://www.doi.org/), please provide it here.
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
yangtao9009/Unsplash2K | ---
license: apache-2.0
---
|
BangumiBase/ginnosaji | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Gin No Saji
This is the image base of the bangumi Gin no Saji. We detected 27 characters and 3,590 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to train models on this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potentially noisy samples (roughly 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 18 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 700 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 181 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 97 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 35 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 44 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 1308 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 64 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 22 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 56 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 14 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 8 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 28 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 81 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 41 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 48 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 23 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 12 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 31 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 11 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 80 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 58 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 65 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 10 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 490 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 8 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 57 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
BrunoGR/Emo_support_prompted | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: texto
dtype: string
- name: etiqueta
dtype: string
- name: Prompt
dtype: string
splits:
- name: train
num_bytes: 37475053
num_examples: 112347
- name: test
num_bytes: 9344955
num_examples: 27445
- name: validation
num_bytes: 674281
num_examples: 2001
download_size: 22190800
dataset_size: 47494289
---
# Dataset Card for "Emo_support_prompted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_wnli_dey_it | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 878
num_examples: 4
- name: test
num_bytes: 1264
num_examples: 4
- name: train
num_bytes: 4938
num_examples: 26
download_size: 12662
dataset_size: 7080
---
# Dataset Card for "VALUE_wnli_dey_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mlfoundations/datacomp_xlarge | ---
license: cc-by-4.0
---
## DataComp XLarge Pool
This repository contains metadata files for the xlarge pool of DataComp. For details on how to use the metadata, please visit [our website](https://www.datacomp.ai/) and our [github repository](https://github.com/mlfoundations/datacomp).
We distribute the image url-text samples and metadata under a standard Creative Common CC-BY-4.0 license. The individual images are under their own copyrights.
## Terms and Conditions
We have terms of service that are similar to those adopted by HuggingFace (https://huggingface.co/terms-of-service), which cover their dataset library. Specifically, any content you download, access, or use from our index is at your own risk and subject to the terms of service or copyright limitations accompanying such content. The image url-text index, which is a research artifact, is provided as is. By using said index, you assume all risks, including but not limited to, liabilities related to image downloading and storage. |
open-llm-leaderboard/details_quantumaikr__llama-2-7b-hf-guanaco-1k | ---
pretty_name: Evaluation run of quantumaikr/llama-2-7b-hf-guanaco-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/llama-2-7b-hf-guanaco-1k](https://huggingface.co/quantumaikr/llama-2-7b-hf-guanaco-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__llama-2-7b-hf-guanaco-1k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T19:26:34.289625](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-7b-hf-guanaco-1k/blob/main/results_2023-10-17T19-26-34.289625.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415914,\n \"f1\": 0.056623322147651096,\n\
\ \"f1_stderr\": 0.0013885957029727636,\n \"acc\": 0.40100097356766773,\n\
\ \"acc_stderr\": 0.009867271082149756\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415914,\n\
\ \"f1\": 0.056623322147651096,\n \"f1_stderr\": 0.0013885957029727636\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \
\ \"acc_stderr\": 0.007223844172845574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n\
\ }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/llama-2-7b-hf-guanaco-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T19_26_34.289625
path:
- '**/details_harness|drop|3_2023-10-17T19-26-34.289625.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T19-26-34.289625.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T19_26_34.289625
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-26-34.289625.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-26-34.289625.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T19_26_34.289625
path:
- '**/details_harness|winogrande|5_2023-10-17T19-26-34.289625.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T19-26-34.289625.parquet'
- config_name: results
data_files:
- split: 2023_10_17T19_26_34.289625
path:
- results_2023-10-17T19-26-34.289625.parquet
- split: latest
path:
- results_2023-10-17T19-26-34.289625.parquet
---
# Dataset Card for Evaluation run of quantumaikr/llama-2-7b-hf-guanaco-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/llama-2-7b-hf-guanaco-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-7b-hf-guanaco-1k](https://huggingface.co/quantumaikr/llama-2-7b-hf-guanaco-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__llama-2-7b-hf-guanaco-1k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T19:26:34.289625](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-7b-hf-guanaco-1k/blob/main/results_2023-10-17T19-26-34.289625.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415914,
"f1": 0.056623322147651096,
"f1_stderr": 0.0013885957029727636,
"acc": 0.40100097356766773,
"acc_stderr": 0.009867271082149756
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415914,
"f1": 0.056623322147651096,
"f1_stderr": 0.0013885957029727636
},
"harness|gsm8k|5": {
"acc": 0.07429871114480667,
"acc_stderr": 0.007223844172845574
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
}
}
```
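As a sanity check, the top-level `acc` above is simply the mean of the per-task accuracies. A minimal sketch parsing the metric values shown (plain Python only; no leaderboard tooling assumed):

```python
import json

# acc values copied from the results JSON above
raw = '''{
    "harness|gsm8k|5": {"acc": 0.07429871114480667},
    "harness|winogrande|5": {"acc": 0.7277032359905288}
}'''
results = json.loads(raw)

# mean over the acc-reporting tasks reproduces the aggregated "acc"
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(mean_acc)  # matches the aggregated "acc" of ~0.40100097356766773
```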
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
reichenbach/drug_combi_instruct | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: doc_id
dtype: string
- name: sentence
dtype: string
- name: spans
list:
- name: span_id
dtype: int64
- name: text
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: token_start
dtype: int64
- name: token_end
dtype: int64
- name: rels
list:
- name: class
dtype: string
- name: spans
sequence: int64
- name: is_context_needed
dtype: bool
- name: paragraph
dtype: string
- name: source
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 5946054
num_examples: 1362
download_size: 2966437
dataset_size: 5946054
---
# Dataset Card for "drug_combi_instruct"
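The `spans`/`rels` schema in the YAML header above links relations to spans by id. A minimal sketch of resolving those ids back to span texts (the record and the `NEG` class label are hypothetical; field names are taken from the feature list):

```python
def resolve_relations(example):
    """Map each relation's span ids back to the span texts they reference."""
    spans_by_id = {s["span_id"]: s["text"] for s in example["spans"]}
    return [
        {"class": rel["class"],
         "drugs": [spans_by_id[i] for i in rel["spans"]]}
        for rel in example["rels"]
    ]

# hypothetical record following the schema in the YAML header
example = {
    "spans": [{"span_id": 0, "text": "aspirin"}, {"span_id": 1, "text": "warfarin"}],
    "rels": [{"class": "NEG", "spans": [0, 1]}],
}
print(resolve_relations(example))
# → [{'class': 'NEG', 'drugs': ['aspirin', 'warfarin']}]
```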
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BirdL/OSD-Dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7440671071.55
num_examples: 198771
download_size: 7196594621
dataset_size: 7440671071.55
---
# Dataset Card for "OSD-Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
This is a reformat of the Hugging Face Projects [SD Multiplayer Dataset](https://huggingface.co/datasets/huggingface-projects/sd-multiplayer-data).
It converts the image bucket into Parquet format. The text column is the prompt plus its timestamp, truncated to minute precision.
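A minimal sketch of composing such a text column from a prompt and a timestamp at minute precision (the helper name and timestamp format are illustrative assumptions, not the exact format used in the dataset):

```python
from datetime import datetime

def make_text(prompt: str, ts: datetime) -> str:
    # append the timestamp, truncated to minute precision as described above;
    # the "%Y-%m-%d %H:%M" layout is an assumption for illustration
    return f"{prompt} {ts.strftime('%Y-%m-%d %H:%M')}"

print(make_text("a red fox in the snow", datetime(2023, 1, 5, 14, 30, 59)))
# → "a red fox in the snow 2023-01-05 14:30"
```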
The model finetuned on it is [here](https://huggingface.co/BirdL/OSD-Model) |
Rardilit/Panther-dataset_v1 | ---
license: other
task_categories:
- text-generation
- conversational
- question-answering
- text2text-generation
language:
- en
tags:
- text generation
- panther
pretty_name: Panther
size_categories:
- 100K<n<1M
---
# Dataset Details
This dataset is a modified version of [Anthropic/hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf).
This dataset is used in fine-tuning [Panther](https://huggingface.co/Rardilit/Panther_v1), a state-of-the-art LLM fine-tuned from the llama-7b pretrained model.
A very small portion (5.3%) of the prompts and responses from this dataset was used to fine-tune and train [Panther](https://huggingface.co/Rardilit/Panther_v1).
## Dataset Details
### Dataset Structure
### Train
Train rows : 377k
### Validation
Validation rows : 20.3k
### Dataset Format
```python
input : "prompt"
output : "response"
```
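Under that format, a record can be folded into a single supervised fine-tuning string. A minimal sketch (the helper and the prompt/response template are illustrative assumptions; only the `input`/`output` field names come from the section above):

```python
def to_sft_text(record: dict) -> str:
    # "input"/"output" field names follow the Dataset Format section above;
    # the template itself is illustrative, not the official training format
    return f"### Prompt:\n{record['input']}\n\n### Response:\n{record['output']}"

print(to_sft_text({"input": "What is RLHF?",
                   "output": "Reinforcement learning from human feedback."}))
```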
## How to Use
```python
from datasets import load_dataset
dataset = load_dataset("Rardilit/Panther-dataset_v1")
``` |
pytorch-survival/support | ---
dataset_info:
features:
- name: x0
dtype: float32
- name: x1
dtype: float32
- name: x2
dtype: float32
- name: x3
dtype: float32
- name: x4
dtype: float32
- name: x5
dtype: float32
- name: x6
dtype: float32
- name: x7
dtype: float32
- name: x8
dtype: float32
- name: x9
dtype: float32
- name: x10
dtype: float32
- name: x11
dtype: float32
- name: x12
dtype: float32
- name: x13
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: int32
splits:
- name: train
num_bytes: 567872
num_examples: 8873
download_size: 212217
dataset_size: 567872
---
# Dataset Card for "support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tarta-ai/jobs-in-california-february-2023 | ---
license: other
task_categories:
- text-classification
language:
- en
tags:
- job
- jobs
- california jobs
pretty_name: Comprehensive Job Count Information by Company in California
size_categories:
- 1M<n<10M
--- |
gguichard/myriade_noun_wsd_bis2 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 75431443
num_examples: 124552
download_size: 15044940
dataset_size: 75431443
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "myriade_noun_wsd_bis2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MeilingShi/legal_argument_mining | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
--- |
MariaK/examples | ---
license: apache-2.0
---
|
Oragani/BoneworksFord | ---
license: afl-3.0
---
|
CyberHarem/dracaena_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dracaena/ドラセナ (Pokémon)
This is the dataset of dracaena/ドラセナ (Pokémon), containing 62 images and their tags.
The core tags of this character are `long_hair, black_hair, breasts, earrings, mature_female`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 55.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 62 | 35.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 133 | 66.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 62 | 49.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 133 | 87.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dracaena_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
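For the IMG+TXT packages listed above, each image ships with a same-named `.txt` tag file. A minimal sketch of pairing them after extracting one of the zips (no waifuc required; the helper name and extension list are assumptions):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image in dataset_dir with its same-named .txt tag file."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
            txt_path = os.path.join(dataset_dir, stem + '.txt')
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    tags = f.read().strip()
                pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Each resulting `(image_path, tags)` pair can then feed a training pipeline directly.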
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, closed_eyes, necklace, open_mouth, smile, pokemon_(creature), dress, simple_background, solo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_eyes | necklace | open_mouth | smile | pokemon_(creature) | dress | simple_background | solo |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-----------|:-------------|:--------|:---------------------|:--------|:--------------------|:-------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
gguichard/wsd_myriade_synth_data_gpt4turbo_canonique | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2009582
num_examples: 3391
download_size: 398083
dataset_size: 2009582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wsd_myriade_synth_data_gpt4turbo_canonique"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_zyh3826__20231206094523-pretrain-Llama-2-13b-hf-76000 | ---
pretty_name: Evaluation run of zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000](https://huggingface.co/zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zyh3826__20231206094523-pretrain-Llama-2-13b-hf-76000\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T19:10:08.159006](https://huggingface.co/datasets/open-llm-leaderboard/details_zyh3826__20231206094523-pretrain-Llama-2-13b-hf-76000/blob/main/results_2023-12-16T19-10-08.159006.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24943893194371924,\n\
\ \"acc_stderr\": 0.030400489062706072,\n \"acc_norm\": 0.25014496177092693,\n\
\ \"acc_norm_stderr\": 0.031209015064341802,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156482,\n \"mc2\": 0.4471244819837127,\n\
\ \"mc2_stderr\": 0.014622242508536614\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27303754266211605,\n \"acc_stderr\": 0.01301933276263575,\n\
\ \"acc_norm\": 0.310580204778157,\n \"acc_norm_stderr\": 0.013522292098053055\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4026090420235013,\n\
\ \"acc_stderr\": 0.0048942100113032235,\n \"acc_norm\": 0.5203146783509262,\n\
\ \"acc_norm_stderr\": 0.0049856612829985835\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.02700876609070809,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.02700876609070809\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n\
\ \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462833,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817244,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.02047323317355198,\n\
\ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02047323317355198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23119266055045873,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036416,\n\
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.03512385283705051,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.03512385283705051\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2771392081736909,\n\
\ \"acc_stderr\": 0.01600563629412242,\n \"acc_norm\": 0.2771392081736909,\n\
\ \"acc_norm_stderr\": 0.01600563629412242\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816643,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816643\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886338,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886338\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290403,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290403\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193113,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193113\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.02540930195322568,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.02540930195322568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156482,\n \"mc2\": 0.4471244819837127,\n\
\ \"mc2_stderr\": 0.014622242508536614\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6124704025256511,\n \"acc_stderr\": 0.01369235463601677\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-10-08.159006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-10-08.159006.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- '**/details_harness|winogrande|5_2023-12-16T19-10-08.159006.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T19-10-08.159006.parquet'
- config_name: results
data_files:
- split: 2023_12_16T19_10_08.159006
path:
- results_2023-12-16T19-10-08.159006.parquet
- split: latest
path:
- results_2023-12-16T19-10-08.159006.parquet
---
# Dataset Card for Evaluation run of zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000](https://huggingface.co/zyh3826/20231206094523-pretrain-Llama-2-13b-hf-76000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zyh3826__20231206094523-pretrain-Llama-2-13b-hf-76000",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-16T19:10:08.159006](https://huggingface.co/datasets/open-llm-leaderboard/details_zyh3826__20231206094523-pretrain-Llama-2-13b-hf-76000/blob/main/results_2023-12-16T19-10-08.159006.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24943893194371924,
"acc_stderr": 0.030400489062706072,
"acc_norm": 0.25014496177092693,
"acc_norm_stderr": 0.031209015064341802,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156482,
"mc2": 0.4471244819837127,
"mc2_stderr": 0.014622242508536614
},
"harness|arc:challenge|25": {
"acc": 0.27303754266211605,
"acc_stderr": 0.01301933276263575,
"acc_norm": 0.310580204778157,
"acc_norm_stderr": 0.013522292098053055
},
"harness|hellaswag|10": {
"acc": 0.4026090420235013,
"acc_stderr": 0.0048942100113032235,
"acc_norm": 0.5203146783509262,
"acc_norm_stderr": 0.0049856612829985835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.02700876609070809,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.02700876609070809
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462833,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02047323317355198,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02047323317355198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767478,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767478
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.03512385283705051,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.03512385283705051
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2771392081736909,
"acc_stderr": 0.01600563629412242,
"acc_norm": 0.2771392081736909,
"acc_norm_stderr": 0.01600563629412242
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816643,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816643
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886338,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886338
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290403,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193113,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193113
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156482,
"mc2": 0.4471244819837127,
"mc2_stderr": 0.014622242508536614
},
"harness|winogrande|5": {
"acc": 0.6124704025256511,
"acc_stderr": 0.01369235463601677
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
harshraj/hinglish_dataset | ---
dataset_info:
features:
- name: User_hinglish
dtype: string
- name: assistant_hinglish
dtype: string
splits:
- name: train
num_bytes: 12625845
num_examples: 10734
download_size: 6674098
dataset_size: 12625845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/bento | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Ben-to
This is the image base of bangumi Ben-to. We detected 17 characters and 1566 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 208 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 125 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 72 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 411 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 15 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 18 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 42 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 40 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 18 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 139 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 29 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 26 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 18 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 46 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 18 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 183 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 158 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
tasksource/bigbench | ---
annotations_creators:
- crowdsourced
- expert-generated
- machine-generated
language_creators:
- crowdsourced
- expert-generated
- machine-generated
- other
language:
- en
license:
- apache-2.0
multilinguality:
- multilingual
- monolingual
pretty_name: bigbench
size_categories:
- unknown
source_datasets:
- original
task_categories:
- multiple-choice
- question-answering
- text-classification
- text-generation
- zero-shot-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- fact-checking
- acceptability-classification
- intent-classification
- multi-class-classification
- multi-label-classification
- text-scoring
- hate-speech-detection
- language-modeling
---
BIG-Bench but it doesn't require the hellish dependencies (tensorflow, pypi-bigbench, protobuf) of the official version.
```python
from datasets import load_dataset

dataset = load_dataset("tasksource/bigbench", "movie_recommendation")
```
Code to reproduce:
https://colab.research.google.com/drive/1MKdLdF7oqrSQCeavAcsEnPdI85kD0LzU?usp=sharing
Datasets are capped to 50k examples to keep things light.
I also removed the default split when a train split was available, to save space, since default = train + validation.
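The capping described above can be sketched as follows. This is a hedged illustration of the stated processing, not the actual reproduction code (see the Colab link below for that); `cap_split` is a hypothetical helper name:

```python
from itertools import islice

MAX_EXAMPLES = 50_000  # per the card, each dataset is capped at 50k examples

def cap_split(examples, limit=MAX_EXAMPLES):
    """Keep at most `limit` examples from an iterable (hypothetical helper)."""
    return list(islice(examples, limit))

print(len(cap_split(range(120_000))))  # 50000
```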
```bibtex
@article{srivastava2022beyond,
title={Beyond the imitation game: Quantifying and extrapolating the capabilities of language models},
author={Srivastava, Aarohi and Rastogi, Abhinav and Rao, Abhishek and Shoeb, Abu Awal Md and Abid, Abubakar and Fisch, Adam and Brown, Adam R and Santoro, Adam and Gupta, Aditya and Garriga-Alonso, Adri{\`a} and others},
journal={arXiv preprint arXiv:2206.04615},
year={2022}
}
``` |
arifzanko/donut-dummy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 103705.0
num_examples: 2
- name: validation
num_bytes: 46768.0
num_examples: 1
- name: test
num_bytes: 48489.0
num_examples: 1
download_size: 109961
dataset_size: 198962.0
---
# Dataset Card for "donut-dummy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_freecs__ThetaWave-7B | ---
pretty_name: Evaluation run of freecs/ThetaWave-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [freecs/ThetaWave-7B](https://huggingface.co/freecs/ThetaWave-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__ThetaWave-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T00:35:27.497472](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B/blob/main/results_2024-01-18T00-35-27.497472.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253257009672182,\n\
\ \"acc_stderr\": 0.03270710367019723,\n \"acc_norm\": 0.6274868169127613,\n\
\ \"acc_norm_stderr\": 0.03336267879887999,\n \"mc1\": 0.5006119951040392,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6525548503294953,\n\
\ \"mc2_stderr\": 0.015542697143559287\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042196,\n\
\ \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729122\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6756622186815375,\n\
\ \"acc_stderr\": 0.00467170170556724,\n \"acc_norm\": 0.8600876319458275,\n\
\ \"acc_norm_stderr\": 0.003461871324067188\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361074,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361074\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\
\ \"acc_stderr\": 0.027621717832907046,\n \"acc_norm\": 0.6193548387096774,\n\
\ \"acc_norm_stderr\": 0.027621717832907046\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010333,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.033212448425471275,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.033212448425471275\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49273743016759775,\n\
\ \"acc_stderr\": 0.016720737405179514,\n \"acc_norm\": 0.49273743016759775,\n\
\ \"acc_norm_stderr\": 0.016720737405179514\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.01273854737130396,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.01273854737130396\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n\
\ \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n\
\ \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5006119951040392,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6525548503294953,\n\
\ \"mc2_stderr\": 0.015542697143559287\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5610310841546626,\n \
\ \"acc_stderr\": 0.013669500369036205\n }\n}\n```"
repo_url: https://huggingface.co/freecs/ThetaWave-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|arc:challenge|25_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|gsm8k|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hellaswag|10_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T00-35-27.497472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T00-35-27.497472.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- '**/details_harness|winogrande|5_2024-01-18T00-35-27.497472.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T00-35-27.497472.parquet'
- config_name: results
data_files:
- split: 2024_01_18T00_35_27.497472
path:
- results_2024-01-18T00-35-27.497472.parquet
- split: latest
path:
- results_2024-01-18T00-35-27.497472.parquet
---
# Dataset Card for Evaluation run of freecs/ThetaWave-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freecs/ThetaWave-7B](https://huggingface.co/freecs/ThetaWave-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freecs__ThetaWave-7B",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-18T00:35:27.497472](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B/blob/main/results_2024-01-18T00-35-27.497472.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6253257009672182,
"acc_stderr": 0.03270710367019723,
"acc_norm": 0.6274868169127613,
"acc_norm_stderr": 0.03336267879887999,
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6525548503294953,
"mc2_stderr": 0.015542697143559287
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042196,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729122
},
"harness|hellaswag|10": {
"acc": 0.6756622186815375,
"acc_stderr": 0.00467170170556724,
"acc_norm": 0.8600876319458275,
"acc_norm_stderr": 0.003461871324067188
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361074,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361074
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.027621717832907046,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.027621717832907046
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010333,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.033212448425471275,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.033212448425471275
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49273743016759775,
"acc_stderr": 0.016720737405179514,
"acc_norm": 0.49273743016759775,
"acc_norm_stderr": 0.016720737405179514
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.01273854737130396,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.01273854737130396
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6525548503294953,
"mc2_stderr": 0.015542697143559287
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.5610310841546626,
"acc_stderr": 0.013669500369036205
}
}
```
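As a rough illustration (this is not part of the official leaderboard tooling), the `acc_stderr` fields above can be turned into approximate 95% confidence intervals with the usual normal approximation:

```python
# Sketch: convert an accuracy and its standard error into an approximate
# 95% confidence interval (normal approximation, +/- 1.96 * stderr).
# The values below are taken from the winogrande entry in the results above.

def confidence_interval(acc: float, stderr: float, z: float = 1.96):
    """Return (lower, upper) bounds of an approximate confidence interval."""
    return acc - z * stderr, acc + z * stderr

lower, upper = confidence_interval(0.7900552486187845, 0.01144628062926263)
print(f"winogrande acc: {0.7900552486187845:.3f} (95% CI: {lower:.3f} - {upper:.3f})")
```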
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Prasann15479/PII-Dataset | ---
license: apache-2.0
---
This dataset was created with the Gemini API, using the Kaggle notebook: https://www.kaggle.com/code/newtonbaba12345/pii-detection-data-generation-using-gemini
|
RJCentury/generalScaffolding | ---
license: openrail
language:
- sk
--- |
liuyanchen1015/MULTI_VALUE_mrpc_their_them | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 25363
num_examples: 88
- name: train
num_bytes: 43632
num_examples: 151
- name: validation
num_bytes: 4237
num_examples: 15
download_size: 60221
dataset_size: 73232
---
# Dataset Card for "MULTI_VALUE_mrpc_their_them"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mlpc-lab/YTTB-VQA | ---
task_categories:
- visual-question-answering
language:
- en
pretty_name: YTTB-VQA
size_categories:
- n<1K
license: cc-by-nc-4.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://gordonhu608.github.io/bliva/
- **Repository:** https://github.com/mlpc-ucsd/BLIVA.git
- **Paper:**
- **Point of Contact:** w1hu@ucsd.edu
### Dataset Summary
The YTTB-VQA Dataset is a collection of 400 YouTube thumbnail question-answer pairs built to evaluate the visual perception abilities of models on text-rich images. It covers 11
categories, including technology, sports, entertainment, food, news, history, music, nature, cars, and education.
### Supported Tasks and Leaderboards
This dataset supports many tasks, including visual question answering, image captioning, etc.
### License
CC-By-NC-4.0
### Languages
The language of the data is primarily English.
## Getting Started
### Creating the dataset
Run the following command to download the images and create the dataset:
```bash
python3 create_dataset.py
```
You will find the images in `images_new` and the dataset in `youtube_new.json`.
## Dataset Structure
### Data Instances
A data instance in this dataset pairs a YouTube thumbnail with a human-generated question submitted to BLIVA; the corresponding answer is then entered into the answer field.
### Data Fields
**video_id:** a unique string identifying a specific YouTube thumbnail image.<br>
**question:** a human-generated question about the thumbnail.<br>
**video_classes:** the category assigned to the YouTube thumbnail image.<br>
**answers:** the ground-truth answer to the question about the thumbnail.<br>
**video_link:** the URL of the corresponding YouTube video.
### Data Splits
The data are unsplit.
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
We randomly selected YouTube videos with text-rich thumbnails from different categories during the data collection.
We recorded the unique video ID for each YouTube video and obtained the high-resolution thumbnail from the
URL `http://img.youtube.com/vi/<YouTube-Video-ID>/maxresdefault.jpg`.
### Annotations
#### Annotation process
We created the annotation file in JSON format with the following fields: "video_id", "question", "video_classes", "answers", and "video_link".
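As a small sketch (the field values here are invented for illustration), an annotation entry can be parsed and the thumbnail URL reconstructed from the video ID using the pattern described in the data-collection section:

```python
import json

# A hypothetical annotation entry following the field layout described above.
sample = json.loads("""
{
  "video_id": "abc123XYZ",
  "question": "What is the title shown on the thumbnail?",
  "video_classes": "education",
  "answers": "Example Title",
  "video_link": "https://www.youtube.com/watch?v=abc123XYZ"
}
""")

def thumbnail_url(video_id: str) -> str:
    # High-resolution thumbnail URL pattern from the data-collection section.
    return f"http://img.youtube.com/vi/{video_id}/maxresdefault.jpg"

print(thumbnail_url(sample["video_id"]))
```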
## Considerations for Using the Data
### Discussion of Biases
Although our dataset spans 11 categories, their distribution is uneven. For example, 18% of the dataset pertains to education, while only 2% is dedicated to news.
### Acknowledgments
The YouTube thumbnails dataset is purely for academic research and not for any monetary use. If any author finds their thumbnail image used inappropriately in our dataset, please contact us directly at w1hu@ucsd.edu and we will remove the image immediately.
CyberHarem/michishio_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of michishio/満潮/满潮 (Azur Lane)
This is the dataset of michishio/満潮/满潮 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `animal_ears, cat_ears, bangs, animal_ear_fluff, breasts, brown_hair, long_hair, ahoge, brown_eyes, braid, cat_girl, large_breasts, hair_between_eyes, cat_tail, medium_breasts, ribbon, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 24.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 17.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 51 | 34.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 23.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 51 | 44.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/michishio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
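The IMG+TXT packages are plain archives that do not need waifuc. A minimal sketch of pairing each image with its tag file after extraction, assuming (as is conventional for such packages) one `.txt` of comma-separated tags per image sharing the image's filename stem:

```python
import tempfile
from pathlib import Path

def pair_img_txt(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package."""
    root = Path(dataset_dir)
    images = sorted(p for p in root.iterdir()
                    if p.suffix.lower() in {".png", ".jpg", ".jpeg", ".webp"})
    for img in images:
        txt = img.with_suffix(".txt")
        if txt.exists():
            yield img, txt.read_text(encoding="utf-8").strip()

# Demonstrate on a throwaway directory with one fake image/tag pair.
demo = Path(tempfile.mkdtemp())
(demo / "0001.png").write_bytes(b"")  # stand-in for a real image file
(demo / "0001.txt").write_text("1girl, cat_ears, smile", encoding="utf-8")
pairs = list(pair_img_txt(demo))
print(pairs[0][0].name, "->", pairs[0][1])
```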
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, balloon, detached_sleeves, looking_at_viewer, open_mouth, solo, :d, pink_dress, puffy_short_sleeves, frills, full_body, high_heels, pink_footwear, white_background, white_thighhighs, bare_shoulders, blush, bow, cleavage_cutout, hair_rings, jingle_bell, petals, simple_background, standing_on_one_leg, tiara, very_long_hair, virtual_youtuber |
| 1 | 15 |  |  |  |  |  | blush, jingle_bell, :d, neck_bell, open_mouth, kimono, long_sleeves, looking_at_viewer, red_skirt, 2girls, bare_shoulders, wide_sleeves, off_shoulder, pleated_skirt, white_shirt, sailor_collar, simple_background, white_background, holding, red_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | balloon | detached_sleeves | looking_at_viewer | open_mouth | solo | :d | pink_dress | puffy_short_sleeves | frills | full_body | high_heels | pink_footwear | white_background | white_thighhighs | bare_shoulders | blush | bow | cleavage_cutout | hair_rings | jingle_bell | petals | simple_background | standing_on_one_leg | tiara | very_long_hair | virtual_youtuber | neck_bell | kimono | long_sleeves | red_skirt | 2girls | wide_sleeves | off_shoulder | pleated_skirt | white_shirt | sailor_collar | holding | red_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------------------|:--------------------|:-------------|:-------|:-----|:-------------|:----------------------|:---------|:------------|:-------------|:----------------|:-------------------|:-------------------|:-----------------|:--------|:------|:------------------|:-------------|:--------------|:---------|:--------------------|:----------------------|:--------|:-----------------|:-------------------|:------------|:---------|:---------------|:------------|:---------|:---------------|:---------------|:----------------|:--------------|:----------------|:----------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | | | | X | X | | X | | | | | | | X | | X | X | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
venetis/twitter_us_airlines_kaggle | ---
license: afl-3.0
---
Dataset link:
https://www.kaggle.com/datasets/crowdflower/twitter-airline-sentiment?sort=most-comments |
luden/images | ---
license: other
---
|
senhorsapo/gui | ---
license: openrail
---
|
santoshtyss/canadian_legislation | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 185323320
num_examples: 5000
- name: validation
num_bytes: 9358169
num_examples: 500
download_size: 67958483
dataset_size: 194681489
---
# Dataset Card for "canadian_legislation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/haiku_dpo | ---
license: cc-by-4.0
size_categories:
- 1K<n<10K
task_categories:
- text-generation
- reinforcement-learning
pretty_name: Haiku DPO
dataset_info:
- config_name: aesthetic-preference
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 3090146
num_examples: 1500
download_size: 518656
dataset_size: 3090146
- config_name: default
features:
- name: question
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: generations
sequence: string
- name: scores
sequence: int64
- name: chosen
dtype: string
- name: chosen_score
dtype: int64
- name: rejected
dtype: string
- name: rejected_score
dtype: int64
- name: tie
dtype: bool
- name: difference_in_score
dtype: int64
- name: system
dtype: string
splits:
- name: train
num_bytes: 45631767
num_examples: 4123
download_size: 3632867
dataset_size: 45631767
- config_name: raw
features:
- name: prompt
dtype: string
- name: responses
sequence: string
- name: scores
sequence: int64
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: tie
dtype: bool
- name: difference_in_score
dtype: int64
splits:
- name: train
num_bytes: 5462
num_examples: 10
download_size: 9198
dataset_size: 5462
- config_name: raw-haikus
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 52003027
num_examples: 4303
download_size: 6328873
dataset_size: 52003027
- config_name: raw-scored-haikus
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: generations
sequence: string
- name: scores
sequence: int64
splits:
- name: train
num_bytes: 26255574
num_examples: 3220
download_size: 1986498
dataset_size: 26255574
- config_name: rule_ranked
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: generations
sequence: string
- name: scores
sequence: int64
- name: chosen
dtype: string
- name: chosen_score
dtype: int64
- name: rejected
dtype: string
- name: rejected_score
dtype: int64
- name: tie
dtype: bool
- name: difference_in_score
dtype: int64
splits:
- name: train
num_bytes: 46515868
num_examples: 4302
download_size: 3772778
dataset_size: 46515868
configs:
- config_name: aesthetic-preference
data_files:
- split: train
path: aesthetic-preference/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: raw
data_files:
- split: train
path: raw/train-*
- config_name: raw-haikus
data_files:
- split: train
path: raw-haikus/train-*
- config_name: raw-scored-haikus
data_files:
- split: train
path: raw-scored-haikus/train-*
- config_name: raw_prompts
data_files:
- split: train
path: raw_prompts/train-*
- config_name: rule_ranked
data_files:
- split: train
path: rule_ranked/train-*
tags:
- dpo
- poetry
- synthetic
- distilabel
---
<h1 align="center">🌸 Haiku DPO 🌸</h1>
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/60107b385ac3e86b3ea4fc34/veyblgmspfou3f3SgZxwX.png" alt="Your Image" width="500">
</p>
<p align="center"><em>In data, words flow,<br>
Teaching AI the art of<br>
Haiku, line by line.
</em></p>
# Dataset Card for Haiku DPO
[<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-dark.png" alt="Built with Distilabel" width="200" height="32"/>](https://github.com/argilla-io/distilabel)
<!-- Provide a quick summary of the dataset. -->
This is a synthetic dataset of haikus. The dataset is constructed with the goal of helping to train LLMs to be more 'technically' competent at writing haikus.
## Dataset Details
The data consists of a few different components that are described in more detail below, but the key ones are:
- a column of synthetically generated user prompts requesting a haiku
- a column consisting of multiple responses to this prompt, generated by a language model
- a column consisting of scores for each of these responses, generated by a rule-based system
The goal of this dataset was to help the author explore the process of synthesizing a dataset for DPO and to explore the extent to which DPO can be used to capture aesthetic preferences in language generation.
Haiku also has the nice property of being relatively easy to score on a 'technical basis' i.e. do they follow the 5-7-5 syllable structure? As a result of this property, some relatively simple Python functions can be used to rate the technical quality of a haiku.
By focusing on a narrower task, this dataset also intends to offer a place to explore questions such as:
- should DPO datasets prioritize a large gap in scores between the 'best' and 'worst' generations?
- is more data better, or is a bigger gap in scores better?
I am also interested in exploring the extent to which smaller models can learn to perform well at a narrower task. Haiku writing is a good candidate for this exploration: it is relatively narrow, the data is cheap to generate, and it is easy to score on a technical basis, so you don't need to rely on human annotation or a "judge" LM to score the generations.
### Dataset Description
- **Curated by:** Daniel van Strien
- **Language(s) (NLP):** English (synthetically generated)
- **License:** Creative Commons Attribution 4.0 International License
## Uses
This dataset can be used "as is" to help train LLMs to be more 'technically' competent at writing haikus. However, it is also intended as a "test bed" for exploring how different qualities of a DPO dataset impact models trained on it.
### Direct Use
The `default` config can be used for training DPO models. The "chosen" and "rejected" columns contain the highest-quality and lowest-quality generations respectively. You may, however, want to filter the dataset in other ways to explore how different qualities of a DPO dataset impact the resulting model.
### Out-of-Scope Use
This dataset was constructed with a rather narrow goal in mind. It is unlikely to be useful for other tasks. However, it may be useful as a test bed for exploring how different qualities of a DPO dataset impact the resulting model.
## Dataset Structure
The dataset consists of a few different configurations:
- `default`: this is likely to be the most useful one for most users. It contains the highest-quality and lowest-quality generations in the "chosen" and "rejected" columns respectively. It also contains the "difference_in_score" column which is the difference between the score of the highest-quality generation and the lowest-quality generation. This column can be used to filter the dataset to explore how different qualities of a DPO dataset impact the resulting model.
The `default` configuration has the following columns:
- 'question': the prompt requesting a haiku
- 'generation_model': the name of the model used to generate the haiku
- 'generation_prompt': the full prompt used to generate the haiku
- 'generations': the haikus generated by the model
- 'scores': the scores for each of the haikus
- 'chosen': the highest-quality haiku
- 'chosen_score': the score for the highest-quality haiku
- 'rejected': the lowest-quality haiku
- 'rejected_score': the score for the lowest-quality haiku
- 'tie': whether the highest-quality and lowest-quality haikus have the same score
- 'difference_in_score': the difference between the score of the highest-quality generation and the lowest-quality generation
- 'system': the system prompt used during generation
The `default` configuration removes ties and ensures the lowest-quality generation has a score below 3. More information on the scoring process is outlined below.
The `rule_ranked` configuration is similar to the `default` configuration but has not been filtered at all, so it gives you more scope for things like including ties in your dataset.
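As a hedged sketch of this kind of slicing — the toy rows below merely stand in for real rows (in practice you would load the `default` or `rule_ranked` config with `datasets.load_dataset`), but the column names match the card:

```python
# Toy preference pairs using the card's column names (values are illustrative).
rows = [
    {"chosen": "haiku a", "rejected": "haiku w", "tie": False, "difference_in_score": 1},
    {"chosen": "haiku b", "rejected": "haiku x", "tie": True,  "difference_in_score": 0},
    {"chosen": "haiku c", "rejected": "haiku y", "tie": False, "difference_in_score": 3},
    {"chosen": "haiku d", "rejected": "haiku z", "tie": False, "difference_in_score": 4},
]

# Drop ties and keep only pairs with a clear quality gap, mirroring in spirit
# how the `default` config narrows down the `rule_ranked` config.
high_contrast = [r for r in rows if not r["tie"] and r["difference_in_score"] >= 3]
print(len(high_contrast))  # 2
```

The same predicate works unchanged as the function passed to `Dataset.filter` when working with the Hub dataset directly.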
## Dataset Creation
This dataset was generated using the [distilabel](https://github.com/argilla-io/distilabel) library with [teknium](https://huggingface.co/teknium)'s [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) model. The prompts were generated from a seed list of terms and an adapted version of the [SELF-INSTRUCT](https://arxiv.org/abs/2212.10560) paper's prompting strategy. You can see more details about the process of generating these prompts in the associated dataset [davanstrien/haiku_prompts](https://huggingface.co/datasets/davanstrien/haiku_prompts).
From these initial prompts, multiple generations of haiku were produced (again using teknium's OpenHermes-2.5-Mistral-7B model). These generations were then scored using a rule-based system. This rule-based system scored haikus out of 4, with the following approach to scoring:
If the haiku is not three lines it scores zero. Then for each line, 1 point is deducted if the line does not match the expected syllable count for that line. This means a haiku with three lines matching the traditional 5-7-5 syllable structure will score 4. A haiku with one line with an incorrect syllable count will score 3.
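The scoring rule above can be sketched in a few lines of Python. This is a hedged illustration rather than the actual code from the linked repository: the `count_syllables` helper here is a naive vowel-group heuristic, whereas real syllable counting is more involved.

```python
def count_syllables(line: str) -> int:
    """Naive syllable estimate: count groups of consecutive vowels."""
    vowels = "aeiouy"
    count = 0
    prev_was_vowel = False
    for ch in line.lower():
        is_vowel = ch in vowels
        if is_vowel and not prev_was_vowel:
            count += 1
        prev_was_vowel = is_vowel
    return count


def score_haiku(haiku: str) -> int:
    """Score out of 4: zero if not three lines, minus 1 per line off 5-7-5."""
    lines = [l for l in haiku.strip().splitlines() if l.strip()]
    if len(lines) != 3:
        return 0
    score = 4
    for line, expected in zip(lines, (5, 7, 5)):
        if count_syllables(line) != expected:
            score -= 1
    return score


haiku = "The wind whispers low\nThe morning sun on the hills\nSoft rain on the roof"
print(score_haiku(haiku))  # 4 under this heuristic
```

A haiku with one off-count line would score 3 under the same function, matching the deduction rule described above.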
The rule-based system is not perfect and will occasionally score a haiku incorrectly. However, it is easy to understand, and haikus are easy to score manually, so it is a good candidate for a rule-based approach. The code for this is shared in this [GitHub repository](https://github.com/davanstrien/haiku-dpo).
### Curation Rationale
The dataset was curated with the following goals in mind:
- to explore the process of using open models to generate synthetic datasets
- to explore the use of rules for ranking generations
- to explore how different slices of a DPO dataset impact the resulting model
### Source Data
#### Data Collection and Processing
See above for the process of generating the data.
#### Who are the source data producers?
Almost all of the data is synthetic. The prompts were generated using a seed list of terms and an adapted version of the [SELF-INSTRUCT](https://arxiv.org/abs/2212.10560) paper's prompting strategy. The generations were produced using teknium's OpenHermes-2.5-Mistral-7B model. The scores were generated using a rule-based system. The initial prompt seed terms were generated by Daniel van Strien with some help from GPT-4.
### Annotations
There are no traditional annotations in this dataset. However, the scores are generated using a rule-based system.
#### Personal and Sensitive Information
It is very unlikely that this dataset contains any personal or sensitive information, but if you find any prompts that you believe to be harmful, please open a discussion and I will remove them from the dataset.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Whilst I have not found any harmful prompts in the dataset, I have not manually validated all of the prompts. If you find any prompts which you believe to be harmful, please open a discussion and I will remove them from the dataset.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
The original seed prompts used to generate this dataset are by no means comprehensive, and the dataset is likely to be biased toward the topics covered by the seed prompts. This dataset will likely develop over time. If you have any suggestions for additional seed prompts, please open a discussion and I will add them to the dataset.
## Citation [optional]
I have zero expectation that this dataset will be cited, but if you do use it in your work, you can cite it as follows:
**BibTeX:**
```bibtex
@misc{vanstrien2024haiku,
title={Haiku DPO},
author={{van Strien}, Daniel},
year={2024},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/datasets/davanstrien/haiku_dpo}}
}
```
## Glossary
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
- DPO/Direct Preference Optimization: Introduced in [*Direct Preference Optimization: Your Language Model is Secretly a Reward Model*](https://huggingface.co/papers/2305.18290)
- SELF-INSTRUCT: A prompting strategy introduced in [*Self-Instruct: Aligning Language Models with Self-Generated Instructions*](https://huggingface.co/papers/2212.10560)
## Dataset Card Authors
[davanstrien](https://huggingface.co/davanstrien)
## Dataset Card Contact
[davanstrien](https://huggingface.co/davanstrien) |
liyucheng/allsides | ---
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: topic
dtype: string
- name: camp
dtype: string
- name: full_stories
dtype: string
- name: articles
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4499065
num_examples: 987
download_size: 2363071
dataset_size: 4499065
---
# Dataset Card for "allsides"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AsphyXIA/baarat-kan-en-dataset-0.1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 908044345
num_examples: 4093524
download_size: 485798531
dataset_size: 908044345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vetertann/promease_chat | ---
license: mit
language:
- ru
pretty_name: vp_train
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Vuno/Sk | ---
license: apache-2.0
---
|
GiovanniHD/AMI | ---
license: openrail
---
|
CyberHarem/fukuyama_mai_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fukuyama_mai/福山舞 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of fukuyama_mai/福山舞 (THE iDOLM@STER: Cinderella Girls), containing 131 images and their tags.
The core tags of this character are `black_hair, ponytail, long_hair, black_eyes, bangs, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 131 | 119.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fukuyama_mai_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 131 | 85.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fukuyama_mai_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 298 | 170.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fukuyama_mai_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 131 | 110.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fukuyama_mai_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 298 | 214.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fukuyama_mai_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fukuyama_mai_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, open_mouth, solo, :d, long_sleeves, looking_at_viewer, blue_dress, blush, hair_bow, randoseru, simple_background, white_background, crime_prevention_buzzer, female_child, full_body, holding_strap, pink_shirt, ribbon, shoes, socks |
| 1 | 13 |  |  |  |  |  | 1girl, solo, hair_bow, looking_at_viewer, open_mouth, blush, white_background, :d, sleeveless, white_gloves, red_dress, ribbon |
| 2 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, blush, shirt, simple_background, white_background, hair_bow, upper_body, dated, ribbon, signature |
| 3 | 5 |  |  |  |  |  | 1girl, open_mouth, blush, looking_at_viewer, miniskirt, red_skirt, scrunchie, simple_background, solo, white_background, :d, brown_eyes, plaid_skirt, from_behind, hair_ornament, long_sleeves, looking_back, pleated_skirt, turtleneck_sweater, white_sweater |
| 4 | 6 |  |  |  |  |  | 1girl, navel, solo, flat_chest, blush, loli, open_mouth, smile, groin, pink_bikini, scrunchie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | solo | :d | long_sleeves | looking_at_viewer | blue_dress | blush | hair_bow | randoseru | simple_background | white_background | crime_prevention_buzzer | female_child | full_body | holding_strap | pink_shirt | ribbon | shoes | socks | sleeveless | white_gloves | red_dress | smile | shirt | upper_body | dated | signature | miniskirt | red_skirt | scrunchie | brown_eyes | plaid_skirt | from_behind | hair_ornament | looking_back | pleated_skirt | turtleneck_sweater | white_sweater | navel | flat_chest | loli | groin | pink_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------|:-----|:---------------|:--------------------|:-------------|:--------|:-----------|:------------|:--------------------|:-------------------|:--------------------------|:---------------|:------------|:----------------|:-------------|:---------|:--------|:--------|:-------------|:---------------|:------------|:--------|:--------|:-------------|:--------|:------------|:------------|:------------|:------------|:-------------|:--------------|:--------------|:----------------|:---------------|:----------------|:---------------------|:----------------|:--------|:-------------|:-------|:--------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | X | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | X | | X | X | | X | X | | | | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | X | X | X | X | X |
|
BramVanroy/wiki_simplifications_dutch_dedup_split | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: result
dtype: string
splits:
- name: train
num_bytes: 1973422943.5509233
num_examples: 2754760
- name: validation
num_bytes: 5868489.724538313
num_examples: 8192
- name: test
num_bytes: 5868489.724538313
num_examples: 8192
download_size: 1289141718
dataset_size: 1985159923.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
This is a variant of [the original dataset](https://huggingface.co/datasets/UWV/Leesplank_NL_wikipedia_simplifications).
- It was shuffled (seed=42);
- Deduplicated on rows (96,613 rows removed);
- Split into train, validation and test sets (the latter two have 8,192 samples each)
## Reproduction
```python
from datasets import load_dataset, Dataset, DatasetDict
ds = load_dataset("UWV/Leesplank_NL_wikipedia_simplifications", split="train")
ds = ds.shuffle(seed=42)
print("original", ds)
df = ds.to_pandas()
df = df.drop_duplicates().reset_index()
ds = Dataset.from_pandas(df)
print("dedupe", ds)
ds = ds.select_columns(["prompt", "result"])
test_split = ds.train_test_split(test_size=8192)
valid_split = test_split["train"].train_test_split(test_size=8192)
final = DatasetDict({
"train": valid_split["train"],
"validation": valid_split["test"],
"test": test_split["test"]
})
print(final)
final.push_to_hub("BramVanroy/wiki_simplifications_dutch_dedup_split")
```
|
Nart/abkhaz_text | ---
language_creators:
- expert-generated
language:
- ab
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: Abkhaz monolingual corpus
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for "Abkhaz text"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Other Known Limitations](#other-known-limitations)
## Dataset Description
- **Point of Contact:** [Nart Tlisha](mailto:daniel.abzakh@gmail.com)
- **Size of the generated dataset:** 176 MB
### Dataset Summary
The Abkhaz language monolingual dataset is a collection of 1,470,480 sentences extracted from different sources. The dataset is available under the Creative Commons Universal Public Domain License. Part of it is also available through [Common Voice](https://commonvoice.mozilla.org/ab); another part comes from the [Abkhaz National Corpus](https://clarino.uib.no/abnc).
## Dataset Creation
### Source Data
Here is a link to the source of a large part of the data on [github](https://github.com/danielinux7/Multilingual-Parallel-Corpus/blob/master/ebooks/reference.md)
## Considerations for Using the Data
### Other Known Limitations
The accuracy of the dataset is around 95% (the remaining errors are grammatical and orthographical).
|
kheopss/instructed_humorous_tone_dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: system
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 599838
num_examples: 114
download_size: 357844
dataset_size: 599838
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
quyanh/cot-large | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 14568291.0
num_examples: 35873
download_size: 8626487
dataset_size: 14568291.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cot-large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thamognya/ALotNLI | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- agpl-3.0
multilinguality:
- monolingual
pretty_name: A Lot of NLI
size_categories:
- 100K<n<1M
source_datasets:
- snli
- multi_nli
- anli
task_categories:
- text-classification
task_ids:
- natural-language-inference
viewer: true
---
# Repo
Github Repo: [thamognya/TBertNLI](https://github.com/thamognya/TBertNLI) specifically in the [src/data directory](https://github.com/thamognya/TBertNLI/tree/master/src/data).
# Sample
```
   premise                                            hypothesis                          label
0 this church choir sings to the masses as they ... the church is filled with song 0
1 this church choir sings to the masses as they ... a choir singing at a baseball game 2
2 a woman with a green headscarf blue shirt and ... the woman is young 1
3 a woman with a green headscarf blue shirt and ... the woman is very happy 0
4 a woman with a green headscarf blue shirt and ... the woman has been shot 2
```
# Dataset Origins
As of now, the checked datasets below have been used to make this dataset; the others are TODO.
- [x] SNLI
- [x] MultiNLI
- [ ] SuperGLUE
- [ ] FEVER
- [ ] WIKI-FACTCHECK
- [x] ANLI
- [ ] more from Hugging Face
# Reasons
Made purely for fine-tuning NLI models (not for zero-shot classification).
|
jose-h-solorzano/synth-forgetting-generalization-7 | ---
dataset_info:
features:
- name: input
sequence: float64
- name: output
sequence: float64
splits:
- name: train
num_bytes: 16320000.0
num_examples: 40000
- name: test
num_bytes: 4080000.0
num_examples: 10000
download_size: 14132621
dataset_size: 20400000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
rjindal/Jindal_cnn | ---
dataset_info:
features:
- name: article
dtype: string
- name: highlights
dtype: string
splits:
- name: train
num_bytes: 77368091
num_examples: 10000
download_size: 46894336
dataset_size: 77368091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
romain22222/pokemon-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 166725933.125
num_examples: 1271
download_size: 163282284
dataset_size: 166725933.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped | ---
pretty_name: Evaluation run of EleutherAI/pythia-6.9b-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/pythia-6.9b-deduped](https://huggingface.co/EleutherAI/pythia-6.9b-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T01:47:10.144336](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped/blob/main/results_2023-10-22T01-47-10.144336.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335642,\n \"f1\": 0.04495805369127533,\n\
\ \"f1_stderr\": 0.0011424943224633687,\n \"acc\": 0.32878164020122397,\n\
\ \"acc_stderr\": 0.008505355545421337\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335642,\n\
\ \"f1\": 0.04495805369127533,\n \"f1_stderr\": 0.0011424943224633687\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \
\ \"acc_stderr\": 0.003527595888722438\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120236\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/pythia-6.9b-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T01_47_10.144336
path:
- '**/details_harness|drop|3_2023-10-22T01-47-10.144336.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T01-47-10.144336.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T01_47_10.144336
path:
- '**/details_harness|gsm8k|5_2023-10-22T01-47-10.144336.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T01-47-10.144336.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:40:55.095296.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:40:55.095296.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T01_47_10.144336
path:
- '**/details_harness|winogrande|5_2023-10-22T01-47-10.144336.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T01-47-10.144336.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_40_55.095296
path:
- results_2023-07-19T17:40:55.095296.parquet
- split: 2023_10_22T01_47_10.144336
path:
- results_2023-10-22T01-47-10.144336.parquet
- split: latest
path:
- results_2023-10-22T01-47-10.144336.parquet
---
# Dataset Card for Evaluation run of EleutherAI/pythia-6.9b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-6.9b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-6.9b-deduped](https://huggingface.co/EleutherAI/pythia-6.9b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
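For illustration, each split name encodes the timestamp of its evaluation run and can be parsed back into a `datetime` (a minimal sketch; the split name below is taken from this card's config list):

```python
from datetime import datetime

# Split names such as "2023_10_22T01_47_10.144336" encode the timestamp
# of the evaluation run; they can be parsed back into a datetime object.
split_name = "2023_10_22T01_47_10.144336"
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # 2023-10-22T01:47:10.144336
```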
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-22T01:47:10.144336](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped/blob/main/results_2023-10-22T01-47-10.144336.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in its own configuration, and the "latest" split of each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335642,
"f1": 0.04495805369127533,
"f1_stderr": 0.0011424943224633687,
"acc": 0.32878164020122397,
"acc_stderr": 0.008505355545421337
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335642,
"f1": 0.04495805369127533,
"f1_stderr": 0.0011424943224633687
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.003527595888722438
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120236
}
}
```
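The top-level `"all"` block appears to be the unweighted mean of the per-task values for each metric; here `acc` is averaged over gsm8k and winogrande, the only tasks reporting it. A quick check under that assumed aggregation rule:

```python
# Per-task accuracies from the results above.
gsm8k_acc = 0.016679302501895376
winogrande_acc = 0.6408839779005525

# Unweighted mean across the tasks that report "acc".
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ~0.32878164020122397, matching the "all" block
```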
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_qqp_volition_changes | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 138997
num_examples: 648
- name: test
num_bytes: 1458464
num_examples: 6929
- name: train
num_bytes: 1267368
num_examples: 5789
download_size: 1724303
dataset_size: 2864829
---
# Dataset Card for "MULTI_VALUE_qqp_volition_changes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v3 | ---
pretty_name: Evaluation run of ChavyvAkvar/habib-DPO-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChavyvAkvar/habib-DPO-v3](https://huggingface.co/ChavyvAkvar/habib-DPO-v3) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T11:47:23.423130](https://huggingface.co/datasets/open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v3/blob/main/results_2024-04-15T11-47-23.423130.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6453295376739975,\n\
\ \"acc_stderr\": 0.032216064083720804,\n \"acc_norm\": 0.6461331017721568,\n\
\ \"acc_norm_stderr\": 0.032867317135588374,\n \"mc1\": 0.4834761321909425,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6520209105654754,\n\
\ \"mc2_stderr\": 0.015462684100611999\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\
\ \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.013532472099850937\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6899024098785103,\n\
\ \"acc_stderr\": 0.004615880352799734,\n \"acc_norm\": 0.8665604461262697,\n\
\ \"acc_norm_stderr\": 0.0033935420742276503\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.038035102483515854,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.038035102483515854\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926913,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926913\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.0238680032625001,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.0238680032625001\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n\
\ \"acc_stderr\": 0.016639615236845814,\n \"acc_norm\": 0.45027932960893857,\n\
\ \"acc_norm_stderr\": 0.016639615236845814\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598559,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598559\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553697,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553697\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4834761321909425,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6520209105654754,\n\
\ \"mc2_stderr\": 0.015462684100611999\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235807\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6565579984836998,\n \
\ \"acc_stderr\": 0.013079933811800308\n }\n}\n```"
repo_url: https://huggingface.co/ChavyvAkvar/habib-DPO-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|arc:challenge|25_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|gsm8k|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hellaswag|10_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-47-23.423130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T11-47-23.423130.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- '**/details_harness|winogrande|5_2024-04-15T11-47-23.423130.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T11-47-23.423130.parquet'
- config_name: results
data_files:
- split: 2024_04_15T11_47_23.423130
path:
- results_2024-04-15T11-47-23.423130.parquet
- split: latest
path:
- results_2024-04-15T11-47-23.423130.parquet
---
# Dataset Card for Evaluation run of ChavyvAkvar/habib-DPO-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChavyvAkvar/habib-DPO-v3](https://huggingface.co/ChavyvAkvar/habib-DPO-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T11:47:23.423130](https://huggingface.co/datasets/open-llm-leaderboard/details_ChavyvAkvar__habib-DPO-v3/blob/main/results_2024-04-15T11-47-23.423130.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6453295376739975,
"acc_stderr": 0.032216064083720804,
"acc_norm": 0.6461331017721568,
"acc_norm_stderr": 0.032867317135588374,
"mc1": 0.4834761321909425,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.6520209105654754,
"mc2_stderr": 0.015462684100611999
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.013532472099850937
},
"harness|hellaswag|10": {
"acc": 0.6899024098785103,
"acc_stderr": 0.004615880352799734,
"acc_norm": 0.8665604461262697,
"acc_norm_stderr": 0.0033935420742276503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.038035102483515854,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.038035102483515854
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926913,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.0238680032625001,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.0238680032625001
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.016639615236845814,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.016639615236845814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598559,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598559
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553697,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553697
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4834761321909425,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.6520209105654754,
"mc2_stderr": 0.015462684100611999
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235807
},
"harness|gsm8k|5": {
"acc": 0.6565579984836998,
"acc_stderr": 0.013079933811800308
}
}
```
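Once loaded, the aggregated metrics above are just a nested dictionary keyed by task name. As a quick sanity check, the sketch below pulls out the headline accuracy and per-task accuracies from a results dict of this shape (a few values are copied verbatim from the listing above for illustration; the dict would normally come from parsing the results JSON):

```python
# Minimal sketch: extracting headline metrics from a results dict like the
# one shown above. The dict literal copies a few entries for illustration.
results = {
    "all": {
        "acc": 0.6453295376739975,
        "acc_norm": 0.6461331017721568,
        "mc2": 0.6520209105654754,
    },
    "harness|winogrande|5": {"acc": 0.7932123125493291},
    "harness|gsm8k|5": {"acc": 0.6565579984836998},
}

# Headline average accuracy across tasks.
overall_acc = results["all"]["acc"]

# Per-task accuracies for every harness entry that reports "acc".
per_task = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

print(f"overall acc: {overall_acc:.4f}")
for task, acc in sorted(per_task.items()):
    print(f"{task}: {acc:.4f}")
```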
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
amruta333/text_classification | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
--- |
wanadzhar913/crawl-techrakyat | ---
license: apache-2.0
language:
- en
---
* website: [techrakyat](https://techrakyat.com/)
* num. of webpages scraped: 220
* contributed to: https://github.com/huseinzol05/malaysian-dataset |
autoevaluate/autoeval-staging-eval-project-e1907042-7494831 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- clinc_oos
eval_info:
task: multi_class_classification
model: Omar95farag/distilbert-base-uncased-distilled-clinc
metrics: []
dataset_name: clinc_oos
dataset_config: small
dataset_split: test
col_mapping:
text: text
target: intent
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: Omar95farag/distilbert-base-uncased-distilled-clinc
* Dataset: clinc_oos
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
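Once the prediction files are downloaded, they can be scored against the reference labels. A minimal pure-Python sketch (the record schema below — `text`, `target`, `prediction` — is an illustrative assumption, not necessarily this repo's actual column names):

```python
def accuracy(records):
    """Fraction of records whose predicted label matches the target label."""
    if not records:
        return 0.0
    correct = sum(1 for r in records if r["prediction"] == r["target"])
    return correct / len(records)

# Hypothetical prediction records for illustration only.
records = [
    {"text": "book a table for two", "target": "restaurant_reservation",
     "prediction": "restaurant_reservation"},
    {"text": "what is my balance", "target": "balance",
     "prediction": "pto_request"},
]
print(accuracy(records))  # 0.5
```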
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
BambiMC/ts_test_2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 110880
num_examples: 576
download_size: 2240
dataset_size: 110880
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reach-vb/mls-eng-10k-repunct-test | ---
dataset_info:
features:
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: transcript
dtype: string
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
- name: repunct_text
dtype: string
splits:
- name: dev
num_bytes: 2182237
num_examples: 3807
download_size: 1213838
dataset_size: 2182237
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
---
|
fetchai/citizen_kb_qa_v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8474622
num_examples: 3224
download_size: 1121546
dataset_size: 8474622
---
# Dataset Card for "citizen_kb_qa_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/beir_dbpedia-entity_dev | ---
pretty_name: '`beir/dbpedia-entity/dev`'
viewer: false
source_datasets: ['irds/beir_dbpedia-entity']
task_categories:
- text-retrieval
---
# Dataset Card for `beir/dbpedia-entity/dev`
The `beir/dbpedia-entity/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/dbpedia-entity/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=67
- `qrels`: (relevance assessments); count=5,673
- For `docs`, use [`irds/beir_dbpedia-entity`](https://huggingface.co/datasets/irds/beir_dbpedia-entity)
## Usage
```python
from datasets import load_dataset

queries = load_dataset('irds/beir_dbpedia-entity_dev', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/beir_dbpedia-entity_dev', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
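For evaluation it is common to group the flat qrels records by query. A minimal sketch using the record schema documented above (the sample `query_id`/`doc_id` values are made up for illustration):

```python
from collections import defaultdict

def qrels_to_dict(qrels):
    """Group relevance judgments into {query_id: {doc_id: relevance}}."""
    grouped = defaultdict(dict)
    for rec in qrels:
        grouped[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(grouped)

# Tiny in-memory sample with the same fields as the qrels records above.
sample = [
    {"query_id": "q1", "doc_id": "<dbpedia:Szczecin>", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "<dbpedia:Poland>", "relevance": 0, "iteration": "0"},
]
grouped = qrels_to_dict(sample)
# grouped["q1"] == {"<dbpedia:Szczecin>": 1, "<dbpedia:Poland>": 0}
```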
## Citation Information
```
@article{Hasibi2017DBpediaEntityVA,
title={DBpedia-Entity v2: A Test Collection for Entity Search},
author={Faegheh Hasibi and Fedor Nikolaev and Chenyan Xiong and K. Balog and S. E. Bratsberg and Alexander Kotov and J. Callan},
journal={Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2017}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
arianhosseini/summ_dpo1b1_ngen10_minmax | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 36014136
num_examples: 20000
download_size: 21820054
dataset_size: 36014136
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
haml/halml | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1713054583 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11478
num_examples: 26
download_size: 8713
dataset_size: 11478
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713054583"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Plona/claims_update1 | ---
configs:
- config_name: default
data_files:
- split: train
path: "20230919 Manju_train.csv"
- split: test
path: "20230919 Manju_test.csv"
- split: origin
path: "20230919 Manju.csv"
--- |
autoevaluate/autoeval-staging-eval-project-xsum-3c39b441-10285368 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: google/pegasus-xsum
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-xsum
* Dataset: xsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sheikmohdimran](https://huggingface.co/sheikmohdimran) for evaluating this model. |
lnutiu/alpaca-top | ---
license: openrail
task_categories:
- conversational
- text-generation
language:
- en
size_categories:
- n<1K
--- |
pruthvireddy/Mining_rules | ---
license: mit
---
|
Oikawakaki/Ming-Landsape-Painting | ---
tags:
- art
--- |
VivekNaga/sampledata | ---
license: apache-2.0
---
|
stiffmeister923/Building_Computer_Guide | ---
license: ecl-2.0
---
|
open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped | ---
pretty_name: Evaluation run of EleutherAI/pythia-160m-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/pythia-160m-deduped](https://huggingface.co/EleutherAI/pythia-160m-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T14:10:15.721061](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped/blob/main/results_2023-10-18T14-10-15.721061.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n\
\ \"em_stderr\": 0.0005734993648436387,\n \"f1\": 0.033831795302013495,\n\
\ \"f1_stderr\": 0.0011064778180343976,\n \"acc\": 0.2580433025186501,\n\
\ \"acc_stderr\": 0.007679640365653923\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436387,\n\
\ \"f1\": 0.033831795302013495,\n \"f1_stderr\": 0.0011064778180343976\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5138121546961326,\n \"acc_stderr\": 0.014047122916440422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/pythia-160m-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T14_10_15.721061
path:
- '**/details_harness|drop|3_2023-10-18T14-10-15.721061.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T14-10-15.721061.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T14_10_15.721061
path:
- '**/details_harness|gsm8k|5_2023-10-18T14-10-15.721061.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T14-10-15.721061.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:01:37.454131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:01:37.454131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T14_10_15.721061
path:
- '**/details_harness|winogrande|5_2023-10-18T14-10-15.721061.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T14-10-15.721061.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_01_37.454131
path:
- results_2023-07-19T14:01:37.454131.parquet
- split: 2023_10_18T14_10_15.721061
path:
- results_2023-10-18T14-10-15.721061.parquet
- split: latest
path:
- results_2023-10-18T14-10-15.721061.parquet
---
# Dataset Card for Evaluation run of EleutherAI/pythia-160m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-160m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-160m-deduped](https://huggingface.co/EleutherAI/pythia-160m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-18T14:10:15.721061](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped/blob/main/results_2023-10-18T14-10-15.721061.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436387,
"f1": 0.033831795302013495,
"f1_stderr": 0.0011064778180343976,
"acc": 0.2580433025186501,
"acc_stderr": 0.007679640365653923
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436387,
"f1": 0.033831795302013495,
"f1_stderr": 0.0011064778180343976
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674233
},
"harness|winogrande|5": {
"acc": 0.5138121546961326,
"acc_stderr": 0.014047122916440422
}
}
```
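Judging from the numbers above, the `"all"` block appears to be a plain mean of the per-task metrics (only `gsm8k` and `winogrande` report `acc`). A quick sanity check using the values from the results shown above:

```python
# Per-task accuracies copied from the "Latest results" JSON above
accs = {
    "harness|gsm8k|5": 0.002274450341167551,
    "harness|winogrande|5": 0.5138121546961326,
}

# Unweighted mean across tasks that report "acc"
mean_acc = sum(accs.values()) / len(accs)

# Matches the aggregated value in the "all" block
assert abs(mean_acc - 0.2580433025186501) < 1e-12
print(mean_acc)
```

The same holds for `em` and `f1`, which come from the single `drop` task and are therefore carried over unchanged into `"all"`.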
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gasp/french_rap_songs | ---
license: mit
---
|
kheopss/large_dataset_alpaca_1k_to_hermes | ---
dataset_info:
features:
- name: json_input
dtype: string
- name: titre
dtype: string
- name: prompt0
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 74261807
num_examples: 10840
download_size: 23963068
dataset_size: 74261807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|