| datasetId | card |
|---|---|
WinterSchool/ISIC35TO45 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: conversation
struct:
- name: data
list:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 23803760
num_examples: 500
download_size: 23432954
dataset_size: 23803760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
language:
- en
tags:
- medical
pretty_name: skin cancer conversational dataset
size_categories:
- n<1K
---
This dataset is made using raw data from [ISIC (The International Skin Imaging Collaboration)](https://www.isic-archive.com/), a partnership between academia and industry designed to use digital skin imaging to help reduce skin cancer mortality.
For each image in the original raw dataset, we used the associated metadata to generate a simulated conversation about the image between a user and a chatbot. |
TingChen-ppmc/Tianjin_Dialect_Conversational_Speech_Corpus | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: gender
dtype: string
- name: speaker_id
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 384247410.342
num_examples: 5166
download_size: 370228314
dataset_size: 384247410.342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Corpus
This dataset is built from the Magicdata [ASR-CTJDIACSC: A CHINESE TIANJIN DIALECT CONVERSATIONAL SPEECH CORPUS](https://magichub.com/datasets/tianjin-dialect-conversational-speech-corpus-2/).
This corpus is licensed under a [Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License](http://creativecommons.org/licenses/by-nc-nd/4.0/). Please refer to the license for further information.
Modifications: The audio was split into sentences based on the time spans in the transcription file. Sentences spanning less than 1 second were discarded, and the conversation topics were removed.
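The splitting step described above can be sketched as follows (the time spans and helper function are illustrative, not the actual preprocessing code):

```python
def segment_audio(audio, sampling_rate, spans, min_duration=1.0):
    """Split a waveform into sentence segments using (start, end) time spans
    in seconds; segments shorter than min_duration are discarded."""
    segments = []
    for start, end in spans:
        if end - start < min_duration:
            continue  # drop sentences spanning less than 1 second
        segments.append(audio[int(start * sampling_rate):int(end * sampling_rate)])
    return segments

# Hypothetical example: 10 s of silence at 16 kHz with three labelled spans
waveform = [0.0] * (10 * 16000)
spans = [(0.0, 2.5), (2.5, 4.0), (4.0, 4.5)]  # the last span is only 0.5 s
sentences = segment_audio(waveform, 16000, spans)
print(len(sentences))  # 2: the 0.5 s span is discarded
```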
# Usage
To load this dataset, use
```python
from datasets import load_dataset
dialect_corpus = load_dataset("TingChen-ppmc/Tianjin_Dialect_Conversational_Speech_Corpus")
```
This dataset only has a train split. To carve out a test split, use
```python
from datasets import load_dataset
train_split = load_dataset("TingChen-ppmc/Tianjin_Dialect_Conversational_Speech_Corpus", split="train")
# test_size=0.5 means half of the dataset will go to the test split
corpus = train_split.train_test_split(test_size=0.5)
```
A sample from the data looks like
```python
# note: this sample is from the Nanchang Dialect corpus; the data format is the same
{'audio':
{'path': 'A0001_S001_0_G0001_0.WAV',
'array': array([-0.00030518, -0.00039673,
-0.00036621, ..., -0.00064087,
-0.00015259, -0.00042725]),
'sampling_rate': 16000},
'gender': '女',
'speaker_id': 'G0001',
'transcription': '北京爱数智慧语音采集'
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sugumi_kanagata_alicegearaegisexpansion | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sugumi Kanagata
This is the dataset of Sugumi Kanagata, containing 30 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 30 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 71 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 84 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 30 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 30 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 30 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 71 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 71 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 61 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 84 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 84 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
benayas/banking_llm_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 21973863
num_examples: 10003
- name: test
num_bytes: 6745410
num_examples: 3080
download_size: 2573578
dataset_size: 28719273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
KunalEsM/bank_complaint_intent_classifier | ---
license: apache-2.0
---
|
ylacombe/google-tamil | ---
dataset_info:
- config_name: female
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 1364555763.88
num_examples: 2335
download_size: 1006094564
dataset_size: 1364555763.88
- config_name: male
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 1064641765.528
num_examples: 1956
download_size: 781072069
dataset_size: 1064641765.528
configs:
- config_name: female
data_files:
- split: train
path: female/train-*
- config_name: male
data_files:
- split: train
path: male/train-*
license: cc-by-sa-4.0
task_categories:
- text-to-speech
- text-to-audio
language:
- ta
pretty_name: Tamil Speech
---
# Dataset Card for Tamil Speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Statistics](#data-statistics)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Crowdsourced high-quality Tamil multi-speaker speech data set.](https://www.openslr.org/65/)
- **Repository:** [Google Language Resources and Tools](https://github.com/google/language-resources)
- **Paper:** [Open-source Multi-speaker Speech Corpora for Building Gujarati, Kannada, Malayalam, Marathi, Tamil and Telugu Speech Synthesis Systems](https://aclanthology.org/2020.lrec-1.804/)
### Dataset Summary
This dataset consists of 7 hours of transcribed high-quality audio of Tamil sentences recorded by 50 volunteers. The dataset is intended for speech technologies.
The data archives were restructured from the original ones from [OpenSLR](http://www.openslr.org/65/) to make it easier to stream.
### Supported Tasks
- `text-to-speech`, `text-to-audio`: The dataset can be used to train a model for Text-To-Speech (TTS).
- `automatic-speech-recognition`, `speaker-identification`: The dataset can also be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
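As a refresher, the word error rate mentioned above is the word-level edit distance between reference and hypothesis, normalized by the reference length. A minimal sketch (not tied to any particular evaluation library):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "the bat sat"))  # one substitution out of three words
```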
### How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the female config, simply specify the corresponding config name (i.e., "female" for female speakers):
```python
from datasets import load_dataset
dataset = load_dataset("ylacombe/google-tamil", "female", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
dataset = load_dataset("ylacombe/google-tamil", "female", split="train", streaming=True)
print(next(iter(dataset)))
```
#### *Bonus*
You can create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
**Local:**
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
dataset = load_dataset("ylacombe/google-tamil", "female", split="train")
batch_sampler = BatchSampler(RandomSampler(dataset), batch_size=32, drop_last=False)
dataloader = DataLoader(dataset, batch_sampler=batch_sampler)
```
**Streaming:**
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
dataset = load_dataset("ylacombe/google-tamil", "female", split="train", streaming=True)
dataloader = DataLoader(dataset, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file called `audio` and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided.
```
{'audio': {'path': 'taf_02345_00348037167.wav', 'array': array([-9.15527344e-05, -9.15527344e-05, -1.22070312e-04, ...,
-3.05175781e-05, 0.00000000e+00, 3.05175781e-05]), 'sampling_rate': 48000}, 'text': 'ஆஸ்த்ரேலியப் பெண்ணுக்கு முப்பத்தி மூன்று ஆண்டுகளுக்குப் பின்னர் இந்தியா இழப்பீடு வழங்கியது', 'speaker_id': 2345}
```
### Data Fields
- audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
### Data Statistics
| | Total duration (h) | Average duration (s) | # speakers | # sentences | # total words | # unique words | # total syllables | # unique syllables | # total phonemes | # unique phonemes |
|--------|--------------------|----------------------|------------|-------------|---------------|----------------|-------------------|--------------------|------------------|-------------------|
| Female | 4.01 | 6.18 | 25 | 2,335 | 15,880 | 6,620 | 56,607 | 1,696 | 126,659 | 37 |
| Male | 3.07 | 5.66 | 25 | 1,956 | 13,545 | 6,159 | 48,049 | 1,642 | 107,570 | 37 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of recordings from people who have donated their voices online. By using this dataset, you agree not to attempt to determine the identity of the speakers.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
License: ([CC BY-SA 4.0 DEED](https://creativecommons.org/licenses/by-sa/4.0/deed.en))
### Citation Information
```
@inproceedings{he-etal-2020-open,
title = {{Open-source Multi-speaker Speech Corpora for Building Gujarati, Kannada, Malayalam, Marathi, Tamil and Telugu Speech Synthesis Systems}},
author = {He, Fei and Chu, Shan-Hui Cathy and Kjartansson, Oddur and Rivera, Clara and Katanova, Anna and Gutkin, Alexander and Demirsahin, Isin and Johny, Cibu and Jansche, Martin and Sarin, Supheakmungkol and Pipatsrisawat, Knot},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference (LREC)},
month = may,
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association (ELRA)},
pages = {6494--6503},
url = {https://www.aclweb.org/anthology/2020.lrec-1.800},
  ISBN = {979-10-95546-34-4},
}
```
### Contributions
Thanks to [@ylacombe](https://github.com/ylacombe) for adding this dataset. |
yzhuang/autotree_automl_10000_pol_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 50839806
dataset_size: 472880000
---
# Dataset Card for "autotree_automl_10000_pol_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kukedlc__Jupiter-k-7B-slerp | ---
pretty_name: Evaluation run of Kukedlc/Jupiter-k-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Jupiter-k-7B-slerp](https://huggingface.co/Kukedlc/Jupiter-k-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Jupiter-k-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T03:09:40.468435](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Jupiter-k-7B-slerp/blob/main/results_2024-03-22T03-09-40.468435.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6562920713057232,\n\
\ \"acc_stderr\": 0.031995894016361555,\n \"acc_norm\": 0.6556528588763255,\n\
\ \"acc_norm_stderr\": 0.03266732171630771,\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.017193835812093886,\n \"mc2\": 0.739600519741522,\n\
\ \"mc2_stderr\": 0.014426830459975894\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n\
\ \"acc_norm\": 0.742320819112628,\n \"acc_norm_stderr\": 0.0127807705627684\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7134037044413464,\n\
\ \"acc_stderr\": 0.004512471612415587,\n \"acc_norm\": 0.8881696873132842,\n\
\ \"acc_norm_stderr\": 0.003145134767702312\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993469,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993469\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.017193835812093886,\n \"mc2\": 0.739600519741522,\n\
\ \"mc2_stderr\": 0.014426830459975894\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479651\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954774\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Jupiter-k-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-09-40.468435.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-09-40.468435.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- '**/details_harness|winogrande|5_2024-03-22T03-09-40.468435.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T03-09-40.468435.parquet'
- config_name: results
data_files:
- split: 2024_03_22T03_09_40.468435
path:
- results_2024-03-22T03-09-40.468435.parquet
- split: latest
path:
- results_2024-03-22T03-09-40.468435.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Jupiter-k-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Jupiter-k-7B-slerp](https://huggingface.co/Kukedlc/Jupiter-k-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Jupiter-k-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
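As the metadata above shows, each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper name `run_split_name` is illustrative, not part of any library):

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp such as "2024-03-22T03:09:40.468435" to the
    split name used in this dataset's configurations (assumed convention:
    '-' and ':' are replaced by '_')."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2024-03-22T03:09:40.468435"))
# → 2024_03_22T03_09_40.468435
```

The aggregated metrics can be loaded the same way as the per-task details above, by passing `"results"` as the configuration name.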
## Latest results
These are the [latest results from run 2024-03-22T03:09:40.468435](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Jupiter-k-7B-slerp/blob/main/results_2024-03-22T03-09-40.468435.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task has its own configuration, with results under the timestamped splits and the "latest" split):
```python
{
"all": {
"acc": 0.6562920713057232,
"acc_stderr": 0.031995894016361555,
"acc_norm": 0.6556528588763255,
"acc_norm_stderr": 0.03266732171630771,
"mc1": 0.5936352509179926,
"mc1_stderr": 0.017193835812093886,
"mc2": 0.739600519741522,
"mc2_stderr": 0.014426830459975894
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520767,
"acc_norm": 0.742320819112628,
"acc_norm_stderr": 0.0127807705627684
},
"harness|hellaswag|10": {
"acc": 0.7134037044413464,
"acc_stderr": 0.004512471612415587,
"acc_norm": 0.8881696873132842,
"acc_norm_stderr": 0.003145134767702312
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993469,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993469
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580428,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5936352509179926,
"mc1_stderr": 0.017193835812093886,
"mc2": 0.739600519741522,
"mc2_stderr": 0.014426830459975894
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479651
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954774
}
}
```
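The per-task numbers above can be summarized into a single MMLU score by averaging `acc` over the `harness|hendrycksTest-*` entries; a minimal sketch (the inline `results` dict is an abbreviated, hypothetical stand-in for the full JSON block above):

```python
def mmlu_average(results: dict) -> float:
    """Mean accuracy over all harness|hendrycksTest-* subtasks."""
    accs = [
        entry["acc"]
        for task, entry in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Abbreviated stand-in for the full results JSON above.
results = {
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7171052631578947},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.5},
    "harness|gsm8k|5": {"acc": 0.7050796057619408},  # not an MMLU subtask
}
print(round(mmlu_average(results), 4))  # → 0.6086
```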
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HuggingFaceM4/imagenet1k_support_1k_query_sets_part_0 | Invalid username or password. |
Rewcifer/trainset1_2000_cutoff_llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 249703784.98341143
num_examples: 50000
download_size: 45211692
dataset_size: 249703784.98341143
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "trainset1_2000_cutoff_llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/beatrix_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of beatrix/べあとりくす (Granblue Fantasy)
This is the dataset of beatrix/べあとりくす (Granblue Fantasy), containing 386 images and their tags.
The core tags of this character are `brown_hair, long_hair, breasts, ponytail, large_breasts, bangs, brown_eyes, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 386 | 473.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 386 | 302.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 875 | 602.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 386 | 431.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 875 | 809.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/beatrix_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, cleavage, smile, solo, looking_at_viewer, navel, open_mouth, blush, bare_shoulders, blue_bikini, sunglasses, eyewear_on_head, collarbone, thigh_strap, day, outdoors, sky, wrist_cuffs, ;d, green_eyes, one_eye_closed, asymmetrical_bangs |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, navel, solo, blush, simple_background, smile, blue_bikini, detached_collar, cowboy_shot, halterneck, open_mouth, swept_bangs, white_background |
| 2 | 10 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, cleavage, simple_background, tears, navel, white_background, open_mouth, torn_clothes, upper_body |
| 3 | 6 |  |  |  |  |  | 1girl, blue_ribbon, hair_ribbon, looking_at_viewer, solo, upper_body, simple_background, white_background, asymmetrical_bangs, cleavage, armor, blush, medium_breasts, smile |
| 4 | 5 |  |  |  |  |  | 1girl, armored_boots, cleavage, full_body, gauntlets, holding_sword, looking_at_viewer, midriff, navel, short_shorts, smile, solo, standing, thighhighs, belt, closed_mouth, simple_background, white_background, black_shorts, shoulder_armor, asymmetrical_bangs, floating_hair, gloves, green_eyes, medium_breasts, stomach, thigh_gap |
| 5 | 29 |  |  |  |  |  | 1girl, solo, witch_hat, cleavage, looking_at_viewer, black_gloves, blush, detached_sleeves, bare_shoulders, navel, smile, halloween_costume, open_mouth, midriff, striped_thighhighs |
| 6 | 5 |  |  |  |  |  | 1girl, blush, hetero, nipples, open_mouth, penis, sex, vaginal, 1boy, navel, solo_focus, spread_legs, collarbone, mosaic_censoring, white_background, arms_behind_back, asymmetrical_bangs, belt, boots, cum_in_mouth, cum_in_pussy, cum_on_breasts, facial, heavy_breathing, simple_background, tears, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | smile | solo | looking_at_viewer | navel | open_mouth | blush | bare_shoulders | blue_bikini | sunglasses | eyewear_on_head | collarbone | thigh_strap | day | outdoors | sky | wrist_cuffs | ;d | green_eyes | one_eye_closed | asymmetrical_bangs | simple_background | detached_collar | cowboy_shot | halterneck | swept_bangs | white_background | tears | torn_clothes | upper_body | blue_ribbon | hair_ribbon | armor | medium_breasts | armored_boots | full_body | gauntlets | holding_sword | midriff | short_shorts | standing | thighhighs | belt | closed_mouth | black_shorts | shoulder_armor | floating_hair | gloves | stomach | thigh_gap | witch_hat | black_gloves | detached_sleeves | halloween_costume | striped_thighhighs | hetero | nipples | penis | sex | vaginal | 1boy | solo_focus | spread_legs | mosaic_censoring | arms_behind_back | boots | cum_in_mouth | cum_in_pussy | cum_on_breasts | facial | heavy_breathing | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:-------|:--------------------|:--------|:-------------|:--------|:-----------------|:--------------|:-------------|:------------------|:-------------|:--------------|:------|:-----------|:------|:--------------|:-----|:-------------|:-----------------|:---------------------|:--------------------|:------------------|:--------------|:-------------|:--------------|:-------------------|:--------|:---------------|:-------------|:--------------|:--------------|:--------|:-----------------|:----------------|:------------|:------------|:----------------|:----------|:---------------|:-----------|:-------------|:-------|:---------------|:---------------|:-----------------|:----------------|:---------|:----------|:------------|:------------|:---------------|:-------------------|:--------------------|:---------------------|:---------|:----------|:--------|:------|:----------|:-------|:-------------|:--------------|:-------------------|:-------------------|:--------|:---------------|:---------------|:-----------------|:---------|:------------------|:-----------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | | | | | | | | X | X | | | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | X | | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | X | X | X | | | | | X | | | | | | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Seanxh/twitter_dataset_1713206109 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 141882
num_examples: 333
download_size: 53456
dataset_size: 141882
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Capsekai/DuskfallcrewPhotography | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
- image-classification
language:
- en
tags:
- new zealand
- photography
pretty_name: Duskfall Photography
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://duskfallcrew.carrd.co/
- **Discord:** https://discord.gg/Da7s8d3KJ7
### Dataset Summary
A mixture of photography and other goods from Duskfallcrew that has been either curated or taken by Duskfall Crew. Some may or may not be AI generated.
This template was generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Languages
English mainly, but that's because the data is largely of New Zealand.
### Source Data
### Personal and Sensitive Information
No personal data has been included in this dataset; it is ALL a mixture of AI-generated and personally created photography.
If any data turns out not to match this description, the dataset will be cleaned of those errors.
## Considerations for Using the Data
### Social Impact of Dataset
Too much time on my hands.
### Discussion of Biases
It's a DSLR, it's a Samsung phone - it's a BIRD, IT'S A - you get my point. There should be no bias other than where I can actually take photos.
### Licensing Information
Do not sell this dataset; however, you may use it as you see fit in TEXT TO IMAGE stable diffusion models.
Your outputs are your own, and the data within is free to be used for AI generation models.
### Citation Information
None needed.
### Contributions
If you'd like to contribute, please do so!
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-37000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 637064
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hotal/cowrie_logs | ---
dataset_info:
features:
- name: system
dtype: string
- name: command
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2014066
num_examples: 5150
download_size: 119760
dataset_size: 2014066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibivibiv/alpaca_lamini16 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56195170
num_examples: 129280
download_size: 36322069
dataset_size: 56195170
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nataliaElv/oasst_quality_with_suggestions | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for oasst_quality_with_suggestions
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("nataliaElv/oasst_quality_with_suggestions")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("nataliaElv/oasst_quality_with_suggestions")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt | text | True | False |
| response | Response | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| quality | Rate the quality of the response: | rating | True | N/A | [1, 2, 3, 4, 5] |
| inappropriate-content | Does the response include any of the following? | multi_label_selection | True | N/A | ['Hate speech', 'Sexual content', 'Violent content', 'Untruthful information'] |
| corrected-response | If needed, provide a correction to the response | text | False | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question to assist the annotator during the annotation process, so they are always linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
**✨ NEW** The **metadata** is a dictionary that can be used to provide additional information about each dataset record. This can be useful to give annotators extra context, or to record details about the record itself, such as a link to its original source, the author, or the date. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
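Note that in the `datasets` export the `metadata` dictionary is serialized as a JSON string (see the record example further below), so it needs to be parsed before filtering; a minimal sketch, using the moderation scores from that record and a hypothetical 0.2 threshold:

```python
import json

# Minimal sketch: parse the JSON-serialized "metadata" column from the
# `datasets` export and filter records on a moderation score threshold.
record_metadata = (
    '{"hate_speech": 0.18384182587122588, "sexual_content": 0.1985823986360316, '
    '"untruthful_information": 0.22438455309200378, "violent_content": 0.15704218457210023}'
)
scores = json.loads(record_metadata)
flagged = {name: score for name, score in scores.items() if score > 0.2}
print(flagged)  # → {'untruthful_information': 0.22438455309200378}
```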
The **guidelines** are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"prompt": "Can you explain contrastive learning in machine learning in simple terms for someone new to the field of ML?",
"response": "Sure! Let\u0027s say you want to build a model which can distinguish between images of cats and dogs. You gather your dataset, consisting of many cat and dog pictures. Then you put them through a neural net of your choice, which produces some representation for each image, a sequence of numbers like [0.123, 0.045, 0.334, ...]. The problem is, if your model is unfamiliar with cat and dog images, these representations will be quite random. At one time a cat and a dog picture could have very similar representations (their numbers would be close to each other), while at others two cat images may be represented far apart. In simple terms, the model wouldn\u0027t be able to tell cats and dogs apart. This is where contrastive learning comes in.\n\nThe point of contrastive learning is to take pairs of samples (in this case images of cats and dogs), then train the model to \"pull\" representations of similar pairs (cat-cat or dog-dog) closer to each other and \"push\" representations of different pairs (cat-dog) apart. After doing this for a sufficient number of steps, your model will be able to produce unique, reliable representations for cats and dogs, in essence tell them apart.\n\nThis method is not limited to images, you can typically use it with any dataset that has similar and dissimilar data points."
},
"metadata": {
"hate_speech": 0.18384182587122588,
"sexual_content": 0.1985823986360316,
"untruthful_information": 0.22438455309200378,
"violent_content": 0.15704218457210023
},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"corrected-response": [],
"corrected-response-suggestion": null,
"corrected-response-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"inappropriate-content": [],
"inappropriate-content-suggestion": null,
"inappropriate-content-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"metadata": "{\"hate_speech\": 0.18384182587122588, \"sexual_content\": 0.1985823986360316, \"untruthful_information\": 0.22438455309200378, \"violent_content\": 0.15704218457210023}",
"prompt": "Can you explain contrastive learning in machine learning in simple terms for someone new to the field of ML?",
"quality": [],
"quality-suggestion": null,
"quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"response": "Sure! Let\u0027s say you want to build a model which can distinguish between images of cats and dogs. You gather your dataset, consisting of many cat and dog pictures. Then you put them through a neural net of your choice, which produces some representation for each image, a sequence of numbers like [0.123, 0.045, 0.334, ...]. The problem is, if your model is unfamiliar with cat and dog images, these representations will be quite random. At one time a cat and a dog picture could have very similar representations (their numbers would be close to each other), while at others two cat images may be represented far apart. In simple terms, the model wouldn\u0027t be able to tell cats and dogs apart. This is where contrastive learning comes in.\n\nThe point of contrastive learning is to take pairs of samples (in this case images of cats and dogs), then train the model to \"pull\" representations of similar pairs (cat-cat or dog-dog) closer to each other and \"push\" representations of different pairs (cat-dog) apart. After doing this for a sufficient number of steps, your model will be able to produce unique, reliable representations for cats and dogs, in essence tell them apart.\n\nThis method is not limited to images, you can typically use it with any dataset that has similar and dissimilar data points."
}
```
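Note that in the `datasets` representation the `metadata` field is serialized as a JSON string rather than kept as a nested object. A minimal sketch for recovering the nested form (the record shape follows the examples above; `parse_metadata` is a hypothetical helper, not part of any library):

```python
import json

def parse_metadata(record: dict) -> dict:
    """Deserialize the flattened `metadata` JSON string back into a dict."""
    raw = record.get("metadata")
    return json.loads(raw) if raw else {}

# A record shaped like the HuggingFace `datasets` example above.
record = {"metadata": "{\"hate_speech\": 0.1838, \"sexual_content\": 0.1986}"}
scores = parse_metadata(record)
# scores == {"hate_speech": 0.1838, "sexual_content": 0.1986}
```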
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; currently, only text fields are supported. These are the fields annotators will use to provide responses to the questions.
* **prompt** is of type `text`.
* **response** is of type `text`.
* **Questions:** These are the questions that will be asked of the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **quality** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* **inappropriate-content** is of type `multi_label_selection` with the following allowed values ['Hate speech', 'Sexual content', 'Violent content', 'Untruthful information'].
* (optional) **corrected-response** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **quality-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **inappropriate-content-suggestion** is of type `multi_label_selection` with the following allowed values ['Hate speech', 'Sexual content', 'Violent content', 'Untruthful information'].
* (optional) **corrected-response-suggestion** is of type `text`.
Additionally, there are two more optional fields:
* **✨ NEW** **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, for example a link to the original source of the record, or details such as the author, the date, or the source. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kavsar/az-multiple-choice-questions | ---
dataset_info:
features:
- name: stem
dtype: string
- name: stem_image
dtype: string
- name: options
sequence: string
- name: options_image
sequence: string
- name: true_label
dtype: int64
- name: subject
dtype: string
- name: main_category
dtype: string
- name: sub_category
dtype: string
splits:
- name: train
num_bytes: 10602734
num_examples: 36915
download_size: 3574013
dataset_size: 10602734
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/717aa708 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1337
dataset_size: 184
---
# Dataset Card for "717aa708"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/56527492 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1325
dataset_size: 184
---
# Dataset Card for "56527492"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coeuslearning/yelp_review_full | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': 1 star
'1': 2 star
'2': 3 stars
'3': 4 stars
'4': 5 stars
- name: text
dtype: string
splits:
- name: train
num_bytes: 483811554
num_examples: 650000
- name: test
num_bytes: 37271188
num_examples: 50000
download_size: 322952367
dataset_size: 521082742
task_categories:
- conversational
- text2text-generation
- question-answering
language:
- en
pretty_name: coeusyelp
size_categories:
- 100K<n<1M
---
# Dataset Card for "yelp_review_full"
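The `class_label` mapping in the metadata above pairs integer labels with star-rating names. A minimal sketch of decoding labels client-side (the names follow the YAML block above; `decode_label` is a hypothetical helper):

```python
# Integer label -> star-rating name, as declared in the card metadata above.
LABEL_NAMES = ["1 star", "2 star", "3 stars", "4 stars", "5 stars"]

def decode_label(label: int) -> str:
    """Map a class_label integer to its human-readable name."""
    return LABEL_NAMES[label]

# decode_label(0) == "1 star"
```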
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713074969 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13912
num_examples: 29
download_size: 10861
dataset_size: 13912
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713074969"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/cattleya_baudelaire_violetevergarden | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Cattleya Baudelaire/カトレア・ボードレール (Violet Evergarden)
This is the dataset of Cattleya Baudelaire/カトレア・ボードレール (Violet Evergarden), containing 189 images and their tags.
The core tags of this character are `black_hair, long_hair, earrings, purple_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 189 | 132.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_baudelaire_violetevergarden/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 189 | 130.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_baudelaire_violetevergarden/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 316 | 212.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_baudelaire_violetevergarden/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cattleya_baudelaire_violetevergarden',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
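The `IMG+TXT` packages listed above store each image next to a `.txt` file containing its tags. A minimal sketch for pairing them after extracting one of those archives (the sidecar layout is an assumption based on the package type; `pair_images_with_tags` is a hypothetical helper):

```python
import os

# Extensions assumed for image files in the extracted IMG+TXT package.
IMAGE_EXTS = (".png", ".jpg", ".jpeg", ".webp")

def pair_images_with_tags(filenames):
    """Pair image files with same-stem .txt sidecar tag files."""
    images = {os.path.splitext(f)[0]: f for f in filenames if f.lower().endswith(IMAGE_EXTS)}
    tags = {os.path.splitext(f)[0]: f for f in filenames if f.lower().endswith(".txt")}
    return [(images[stem], tags[stem]) for stem in sorted(images) if stem in tags]

pairs = pair_images_with_tags(["1.png", "1.txt", "2.jpg", "2.txt", "readme.md"])
# pairs == [("1.png", "1.txt"), ("2.jpg", "2.txt")]
```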
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, necklace, solo, cleavage, large_breasts, dress, bookshelf, looking_at_viewer |
| 1 | 12 |  |  |  |  |  | 1girl, necklace, solo, collarbone, red_jacket, looking_at_viewer, cleavage, anime_coloring, bookshelf, upper_body |
| 2 | 8 |  |  |  |  |  | 1girl, dark, no_humans, solo, indoors, jewelry, monochrome |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | necklace | solo | cleavage | large_breasts | dress | bookshelf | looking_at_viewer | collarbone | red_jacket | anime_coloring | upper_body | dark | no_humans | indoors | jewelry | monochrome |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:-----------|:----------------|:--------|:------------|:--------------------|:-------------|:-------------|:-----------------|:-------------|:-------|:------------|:----------|:----------|:-------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | X | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | | | | | | | | | X | X | X | X | X |
|
Pianokill/SportFashion_512x512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8301282461.708
num_examples: 54606
download_size: 4635842533
dataset_size: 8301282461.708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chymaks/Igbo_ner | ---
license: cc-by-nc-2.0
---
|
Jay-Rajput/DIS_IPL_Outcomes | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: outcomes
data_files: outcomes/*.json
dataset_info:
features:
- name: match_id
dtype: string
- name: man_of_the_match
dtype: string
- name: winning_team
dtype: string
splits:
- name: train
num_bytes: 760
num_examples: 20
download_size: 2277
dataset_size: 760
license: apache-2.0
---
|
NathanRoll/TalkBank_CA_CABNC_4 | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 1619533958
num_examples: 10
download_size: 1622055892
dataset_size: 1619533958
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TalkBank_CA_CABNC_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zoom12/SriLankaLaw | ---
license: apache-2.0
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 342516
num_examples: 619
download_size: 75915
dataset_size: 342516
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
orca1130/alfred_gt | ---
license: mit
---
|
ShoukanLabs/OpenNiji-Dataset-Aesthetic-Finetune-0-15K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: prompt
dtype: string
- name: style
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 23769793746.39
num_examples: 15085
download_size: 25132319914
dataset_size: 23769793746.39
---
# Dataset Card for "OpenNiji-Dataset-Aesthetic-Finetune-0-15K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/bena_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of bena/ベナ/贝娜 (Arknights)
This is the dataset of bena/ベナ/贝娜 (Arknights), containing 18 images and their tags.
The core tags of this character are `short_hair, horns, blonde_hair, red_eyes, sheep_horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 23.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 18 | 20.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 41.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bena_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bena_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, short_sleeves, looking_at_viewer, solo, holding, open_mouth, shirt, shoes, socks, dress, sailor_collar, :d, choker, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | short_sleeves | looking_at_viewer | solo | holding | open_mouth | shirt | shoes | socks | dress | sailor_collar | :d | choker | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:--------------------|:-------|:----------|:-------------|:--------|:--------|:--------|:--------|:----------------|:-----|:---------|:----------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
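The clusters above group images by frequently co-occurring tags. As a starting point for similar mining, a minimal sketch that counts how many images carry each tag (tag lists are assumed to come from the `item.meta['tags']` loop shown earlier, here mocked as plain lists; `tag_frequencies` is a hypothetical helper):

```python
from collections import Counter

def tag_frequencies(tag_lists):
    """Count how many images carry each tag (each tag counted once per image)."""
    counts = Counter()
    for tags in tag_lists:
        counts.update(set(tags))  # de-duplicate tags within a single image
    return counts

freqs = tag_frequencies([["1girl", "solo", "dress"], ["1girl", "solo"], ["1girl"]])
# freqs["1girl"] == 3, freqs["solo"] == 2, freqs["dress"] == 1
```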
|
open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-v0.1 | ---
pretty_name: Evaluation run of abhishekchohan/mistral-7B-forest-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishekchohan/mistral-7B-forest-v0.1](https://huggingface.co/abhishekchohan/mistral-7B-forest-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T06:57:46.714861](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-v0.1/blob/main/results_2024-01-23T06-57-46.714861.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6332771095620451,\n\
\ \"acc_stderr\": 0.032195930683822954,\n \"acc_norm\": 0.6396653083892934,\n\
\ \"acc_norm_stderr\": 0.03284463429523817,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.43702115490675825,\n\
\ \"mc2_stderr\": 0.014178223334158307\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256517,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6274646484763992,\n\
\ \"acc_stderr\": 0.004824917516374183,\n \"acc_norm\": 0.8313085042820155,\n\
\ \"acc_norm_stderr\": 0.003737138752336941\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"\
acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\"\
: 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391545,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391545\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077823,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3195530726256983,\n\
\ \"acc_stderr\": 0.015595520294147397,\n \"acc_norm\": 0.3195530726256983,\n\
\ \"acc_norm_stderr\": 0.015595520294147397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806304,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806304\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.43702115490675825,\n\
\ \"mc2_stderr\": 0.014178223334158307\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3555724033358605,\n \
\ \"acc_stderr\": 0.013185402252713852\n }\n}\n```"
repo_url: https://huggingface.co/abhishekchohan/mistral-7B-forest-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|arc:challenge|25_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|gsm8k|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hellaswag|10_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T06-57-46.714861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T06-57-46.714861.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- '**/details_harness|winogrande|5_2024-01-23T06-57-46.714861.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T06-57-46.714861.parquet'
- config_name: results
data_files:
- split: 2024_01_23T06_57_46.714861
path:
- results_2024-01-23T06-57-46.714861.parquet
- split: latest
path:
- results_2024-01-23T06-57-46.714861.parquet
---
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest-v0.1](https://huggingface.co/abhishekchohan/mistral-7B-forest-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-23T06:57:46.714861](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest-v0.1/blob/main/results_2024-01-23T06-57-46.714861.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.6332771095620451,
"acc_stderr": 0.032195930683822954,
"acc_norm": 0.6396653083892934,
"acc_norm_stderr": 0.03284463429523817,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.43702115490675825,
"mc2_stderr": 0.014178223334158307
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256517,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6274646484763992,
"acc_stderr": 0.004824917516374183,
"acc_norm": 0.8313085042820155,
"acc_norm_stderr": 0.003737138752336941
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391545,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391545
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077823,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3195530726256983,
"acc_stderr": 0.015595520294147397,
"acc_norm": 0.3195530726256983,
"acc_norm_stderr": 0.015595520294147397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.43702115490675825,
"mc2_stderr": 0.014178223334158307
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.3555724033358605,
"acc_stderr": 0.013185402252713852
}
}
```
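As an unofficial sketch of how such scores are aggregated, one can average each benchmark's headline metric from the JSON above (note this omits the MMLU mean over the 57 `hendrycksTest` sub-tasks, which the leaderboard average also includes):

```python
# Headline metrics copied from the results JSON above (MMLU omitted for brevity).
headline_metrics = {
    "arc:challenge (acc_norm)": 0.60580204778157,
    "hellaswag (acc_norm)": 0.8313085042820155,
    "truthfulqa:mc (mc2)": 0.43702115490675825,
    "winogrande (acc)": 0.7805840568271507,
    "gsm8k (acc)": 0.3555724033358605,
}

# Simple mean of the five headline scores.
average = sum(headline_metrics.values()) / len(headline_metrics)
print(f"mean of headline metrics: {average:.4f}")  # 0.6021
```

This is only an illustrative aggregation; the official leaderboard computation should be taken from the leaderboard itself.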
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
GATE-engine/COCOStuff164K | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: val
num_bytes: 2431424833.0
num_examples: 5000
- name: train
num_bytes: 57790292141.76
num_examples: 118287
download_size: 39862772718
dataset_size: 60221716974.76
---
# Dataset Card for "COCOStuff164K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cis-lmu/GlotSparse | ---
license: cc0-1.0
language:
- bal
- glk
- brh
- sdh
- kur
- hac
- kiu
- zza
- twi
- fat
- aka
configs:
- config_name: azb_Arab
data_files: "azb_Arab/azb_Arab.csv"
- config_name: bal_Arab
data_files: "bal_Arab/bal_Arab.csv"
- config_name: brh_Arab
data_files: "brh_Arab/brh_Arab.csv"
- config_name: fat_Latn
data_files: "fat_Latn/fat_Latn.csv"
- config_name: glk_Arab
data_files: "glk_Arab/glk_Arab.csv"
- config_name: hac_Arab
data_files: "hac_Arab/hac_Arab.csv"
- config_name: kiu_Latn
data_files: "kiu_Latn/kiu_Latn.csv"
- config_name: sdh_Arab
data_files: "sdh_Arab/sdh_Arab.csv"
- config_name: twi_Latn
data_files: "twi_Latn/twi_Latn.csv"
- config_name: uzs_Arab
data_files: "uzs_Arab/uzs_Arab.csv"
pretty_name: GlotSparse Corpus
---
# GlotSparse Corpus
Collection of news websites in low-resource languages.
- **Homepage:** [homepage](https://github.com/cisnlp/GlotSparse)
- **Repository:** [github](https://github.com/cisnlp/GlotSparse)
- **Paper:** [paper](https://arxiv.org/abs/2310.16248)
- **Point of Contact:** amir@cis.lmu.de
These languages are supported:
```
('azb_Arab', 'South-Azerbaijani_Arab')
('bal_Arab', 'Balochi_Arab')
('brh_Arab', 'Brahui_Arab')
('fat_Latn', 'Fanti_Latn') # aka
('glk_Arab', 'Gilaki_Arab')
('hac_Arab', 'Gurani_Arab')
('kiu_Latn', 'Kirmanjki_Latn') # zza
('sdh_Arab', 'Southern-Kurdish_Arab')
('twi_Latn', 'Twi_Latn') # aka
('uzs_Arab', 'Southern-Uzbek_Arab')
```
## Usage (HF Loader)
Replace `twi_Latn` with your specific language.
```python
from datasets import load_dataset
dataset = load_dataset('cis-lmu/GlotSparse', 'twi_Latn')
print(dataset['train'][0]) # First row of Twi_Latn
```
## Download
If you are not a fan of the HF dataloader or are just interested in a specific language, download it directly:
Replace `twi_Latn` with your specific language.
```python
! wget https://huggingface.co/datasets/cis-lmu/GlotSparse/resolve/main/twi_Latn/twi_Latn.csv
```
## Sources
- **Balochi (bal)**
- News: https://sunnionline.us/balochi/
- Stories: https://kissah.org/
  - Diverse contents such as poems, stories, posts, etc.: https://baask.com/archive/category/balochi/
- **Gilaki (glk)**
  - Social media: The original source of this content is Twitter, but Twitter typically doesn't support Gilaki in its language identifier because Gilaki is a low-resource language. We obtained this content from a Telegram channel (https://t.me/gilaki_twitter) that re-posts Gilaki Twitter content. The channel admins are native Gilaki speakers, and the tweets were selected after manual inspection. At present, there is no readily available mapping back to the original Twitter IDs. The primary reason for reposting Twitter content on Telegram in Iran is that Telegram is easier to access than Twitter.
- **Brahui (brh)**
- News: https://talarbrahui.com/category/news/ and https://talarbrahui.com/category/articles/
- **Southern-Kurdish (sdh)**
- News: https://shafaq.com/ku/ (Feyli)
- **Gurani (hac)**
- News: https://anfsorani.com/هۆرامی (Hawrami)
- **Kirmanjki (kiu)**
- News: https://anfkirmancki.com/
- **Fanti (fat)**
- News: https://akannews.com/fante/
- **Twi (twi)**
- News: https://akannews.com/asante-twi/
- **South-Azerbaijani (azb)**
- News: https://www.trt.net.tr/turki/
- **Southern Uzbek (uzs)**
- News: https://www.trt.net.tr/afghaniuzbek/
## Tools
To compute the script of each text and to remove unwanted languages, we used GlotScript ([code](https://github.com/cisnlp/GlotScript) and [paper](https://arxiv.org/abs/2309.13320)).
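For illustration only (GlotScript is the proper tool for this), a naive script guess can be made directly from Unicode code-point ranges; the helper below is an assumption-laden sketch that only distinguishes the `Arab` and `Latn` suffixes used in this corpus:

```python
def guess_script_suffix(text: str) -> str:
    """Very rough Arab-vs-Latn guess from Unicode code-point ranges.

    Illustration only: GlotScript handles the general case properly.
    """
    # Count characters in the Arabic and Arabic Supplement blocks.
    arab = sum(1 for ch in text if "\u0600" <= ch <= "\u06FF" or "\u0750" <= ch <= "\u077F")
    # Count basic Latin letters.
    latn = sum(1 for ch in text if ch.isascii() and ch.isalpha())
    return "Arab" if arab > latn else "Latn"

print(guess_script_suffix("سلام"))     # Arab
print(guess_script_suffix("Akwaaba"))  # Latn
```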
## License
We do not own any of the text from which this data has been extracted.
We license the actual packaging, the metadata and the annotations of this data under cc0-1.0 (waiving all of the rights under copyright law).
If you are a website/dataset owner and do not want your data to be included in this corpus, please send us an email at amir@cis.lmu.de.
## Ethical Considerations
**1. Biases:** The text corpus may reflect the perspectives, opinions, or demographics of its sources or creators. It is important for users to critically evaluate the text in context, especially for **news sources** and **social media** (e.g., sunnionline, Twitter, ...).
**2. Representativeness:** While we have aimed for diversity and inclusivity, the text corpus may not fully represent all native speakers. Users should be mindful of any potential underrepresentation.
**3. Ethics:** We acknowledge that the collection and use of text data can have ethical implications. We have strived to handle the data responsibly, but we encourage users to consider the broader ethical implications of their own research or applications.
## Github
We also host a GitHub version presenting similar metadata from other sources:
https://github.com/cisnlp/GlotSparse
## Citation
If you use any part of this code and data in your research, please cite it using the following BibTeX entry.
All the sources related to news and social media, and those without a mentioned dataset, were crawled and compiled in this work.
This work is part of the [GlotLID](https://github.com/cisnlp/GlotLID) project.
```
@inproceedings{
kargaran2023glotlid,
title={{GlotLID}: Language Identification for Low-Resource Languages},
author={Kargaran, Amir Hossein and Imani, Ayyoob and Yvon, Fran{\c{c}}ois and Sch{\"u}tze, Hinrich},
booktitle={The 2023 Conference on Empirical Methods in Natural Language Processing},
year={2023},
url={https://openreview.net/forum?id=dl4e3EBz5j}
}
``` |
jxie/modelnet40-2048 | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 322555200
num_examples: 9840
- name: test
num_bytes: 80901040
num_examples: 2468
download_size: 296407531
dataset_size: 403456240
---
# Dataset Card for "modelnet40-2048"
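The `inputs` feature stores each point cloud as a nested list of `float32` values (presumably 2048 points of `[x, y, z]` coordinates, judging by the dataset name; this is an assumption), alongside an integer class `label`. A common preprocessing step is to center each cloud and scale it into the unit sphere; a minimal pure-Python sketch:

```python
def normalize_point_cloud(points):
    """Center a list of [x, y, z] points and scale them into the unit sphere."""
    n = len(points)
    centroid = [sum(p[i] for p in points) / n for i in range(3)]
    centered = [[p[i] - centroid[i] for i in range(3)] for p in points]
    # Largest distance from the origin; `or 1.0` guards a degenerate cloud.
    scale = max((x * x + y * y + z * z) ** 0.5 for x, y, z in centered) or 1.0
    return [[c / scale for c in p] for p in centered]

cloud = [[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
print(normalize_point_cloud(cloud))  # [[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
```

Rows loaded via `load_dataset("jxie/modelnet40-2048")` can then be passed in as `normalize_point_cloud(example["inputs"])`.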
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v4-math-54ae93-2018366739 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v4
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v4
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v4
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v4
* Config: mathemakitten--winobias_antistereotype_test_cot_v4
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
anaselgourch/AnaSight | ---
license: apache-2.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 25146.527777777777
num_examples: 75
- name: test
num_bytes: 11064.472222222223
num_examples: 33
download_size: 28644
dataset_size: 36211.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** ANAS El-GOURCH
- **License:** apache-2.0
### Dataset Sources [optional]
- **Repository:** AnaSight
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Dataset Creation
I created this dataset to train my LLM "Mistral 7b" to be able to answer questions and have conversations.
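Since the card mentions fine-tuning Mistral 7B, here is a hedged sketch of rendering one `input`/`output` row in a Mistral-style `[INST]` prompt (the exact template is an assumption — verify against the model's chat template before training):

```python
def to_mistral_prompt(example: dict) -> str:
    """Render one {'input', 'output'} row in a Mistral-style instruction format.

    The template is illustrative; check the tokenizer's chat template.
    """
    return f"<s>[INST] {example['input']} [/INST] {example['output']}</s>"

# Hypothetical row in the dataset's schema.
sample = {"input": "What is AnaSight?", "output": "A small instruction dataset."}
print(to_mistral_prompt(sample))
```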
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
Personal data
Thank you.
|
7uk3y/train9 | ---
license: mit
---
|
fcabanilla/tobby2 | ---
license: mit
---
|
CyberHarem/medea_fatestaynightufotable | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Medea (Fate Stay Night [UFOTABLE])
This is the dataset of Medea (Fate Stay Night [UFOTABLE]), containing 34 images and their tags.
The core tags of this character are `blue_hair, pointy_ears, purple_lips`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 27.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 21.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 59 | 34.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 27.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 59 | 43.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/medea_fatestaynightufotable',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, solo, lipstick, choker, hood_up, dress, cape |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | lipstick | choker | hood_up | dress | cape |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:---------|:----------|:--------|:-------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X |
|
p1atdev/open2ch | ---
language:
- ja
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- text2text-generation
dataset_info:
- config_name: all-corpus
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
- name: board
dtype: string
splits:
- name: train
num_bytes: 1693355620
num_examples: 8134707
download_size: 868453263
dataset_size: 1693355620
- config_name: all-corpus-cleaned
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
- name: board
dtype: string
splits:
- name: train
num_bytes: 1199092499
num_examples: 6192730
download_size: 615570076
dataset_size: 1199092499
- config_name: livejupiter
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
splits:
- name: train
num_bytes: 1101433134
num_examples: 5943594
download_size: 592924274
dataset_size: 1101433134
- config_name: livejupiter-cleaned
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
splits:
- name: train
num_bytes: 807499499
num_examples: 4650253
download_size: 437414714
dataset_size: 807499499
- config_name: news4vip
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
splits:
- name: train
num_bytes: 420403926
num_examples: 1973817
download_size: 240974172
dataset_size: 420403926
- config_name: news4vip-cleaned
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
splits:
- name: train
num_bytes: 269941607
num_examples: 1402903
download_size: 156934128
dataset_size: 269941607
- config_name: newsplus
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
splits:
- name: train
num_bytes: 56071294
num_examples: 217296
download_size: 32368053
dataset_size: 56071294
- config_name: newsplus-cleaned
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
splits:
- name: train
num_bytes: 33387874
num_examples: 139574
download_size: 19556120
dataset_size: 33387874
- config_name: ranking
features:
- name: dialogue
sequence:
- name: speaker
dtype: int8
- name: content
dtype: string
- name: next
struct:
- name: speaker
dtype: int8
- name: content
dtype: string
- name: random
sequence: string
splits:
- name: train
num_bytes: 1605628
num_examples: 2000
- name: test
num_bytes: 1604356
num_examples: 1953
download_size: 2127033
dataset_size: 3209984
configs:
- config_name: all-corpus
data_files:
- split: train
path: all-corpus/train-*
- config_name: all-corpus-cleaned
data_files:
- split: train
path: all-corpus-cleaned/train-*
- config_name: livejupiter
data_files:
- split: train
path: livejupiter/train-*
- config_name: livejupiter-cleaned
data_files:
- split: train
path: livejupiter-cleaned/train-*
- config_name: news4vip
data_files:
- split: train
path: news4vip/train-*
- config_name: news4vip-cleaned
data_files:
- split: train
path: news4vip-cleaned/train-*
- config_name: newsplus
data_files:
- split: train
path: newsplus/train-*
- config_name: newsplus-cleaned
data_files:
- split: train
path: newsplus-cleaned/train-*
- config_name: ranking
data_files:
- split: train
path: ranking/train-*
- split: test
path: ranking/test-*
tags:
- not-for-all-audiences
---
# おーぷん2ちゃんねる対話コーパス (open2ch dialogue corpus)
## Dataset Details
### Dataset Description
This is a conversion of the [おーぷん2ちゃんねる対話コーパス (open2ch dialogue corpus)](https://github.com/1never/open2ch-dialogue-corpus) into the Hugging Face Datasets format.
- **Curated by:** [More Information Needed]
- **Language:** Japanese
- **License:** Apache-2.0
### Dataset Sources
- **Repository:** https://github.com/1never/open2ch-dialogue-corpus
## Dataset Structure
- `all-corpus`: concatenation of the `livejupiter`, `news4vip`, and `newsplus` subsets
  - `dialogue`: dialogue data (`list[dict]`)
    - `speaker`: speaker ID, either `1` or `2`
    - `content`: utterance text
  - `board`: name of the subset the row was taken from
- `livejupiter`: data converted from `livejupiter.tsv` in the original dataset
  - `dialogue`: dialogue data (`list[dict]`)
    - `speaker`: speaker ID, either `1` or `2`
    - `content`: utterance text
- `news4vip`: data converted from `news4vip.tsv` in the original dataset
  - same structure as above
- `newsplus`: data converted from `newsplus.tsv` in the original dataset
  - same structure as above
- `ranking`: data for the response-ranking task (`ranking.zip` in the original dataset)
  - Has `train` and `test` splits, corresponding to the original dataset's `dev.tsv` and `test.tsv` respectively.
  - `dialogue`: dialogue data (`list[dict]`)
    - `speaker`: speaker ID, either `1` or `2`
    - `content`: utterance text
  - `next`: the correct response that follows the dialogue (`dict`)
    - `speaker`: speaker ID, either `1` or `2`
    - `content`: utterance text
  - `random`: 9 randomly selected responses (`list[str]`)

In addition, `all-corpus`, `livejupiter`, `news4vip`, and `newsplus` each have a variant with `-cleaned` appended to the name; these subsets were filtered using the NG-word list distributed with the original dataset.

In the original dataset, line breaks inside each utterance were replaced with `__BR__`; in this dataset they have all been converted back to `\n`.
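For the response-ranking task, the gold `next` response and the 9 `random` distractors are typically merged into a single 10-way candidate set. A minimal sketch in Python (the `build_ranking_candidates` helper and the sample record are illustrative, not part of the dataset):

```python
import random


def build_ranking_candidates(example, seed=0):
    """Merge the gold response with the 9 distractors and shuffle.

    `example` follows the `ranking` schema: `next` is a dict with
    `speaker`/`content`, `random` is a list of 9 distractor strings.
    Returns the shuffled candidate list and the index of the gold response.
    """
    gold = example["next"]["content"]
    candidates = [gold] + list(example["random"])
    random.Random(seed).shuffle(candidates)
    return candidates, candidates.index(gold)


# Illustrative sample record (not taken from the dataset)
sample = {
    "dialogue": {"speaker": [1], "content": ["なんか眠いわ"]},
    "next": {"speaker": 2, "content": "早く寝ろ"},
    "random": [f"distractor {i}" for i in range(9)],
}
candidates, gold_index = build_ranking_candidates(sample)
assert len(candidates) == 10 and candidates[gold_index] == "早く寝ろ"
```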
## Dataset Creation
### Source Data
(From the original dataset's description:)
> A dialogue corpus built by crawling three boards of おーぷん2ちゃんねる: なんでも実況(ジュピター) (livejupiter), ニュー速VIP (news4vip), and ニュース速報+ (newsplus). It was created from data spanning the site's launch through July 20, 2019.
#### Data Collection and Processing
See the [original dataset](https://github.com/1never/open2ch-dialogue-corpus).
#### Personal and Sensitive Information
Note that the subsets without `-cleaned` contain a large amount of highly inappropriate language, so handle them with care.
## Usage
```py
from datasets import load_dataset
ds = load_dataset(
"p1atdev/open2ch",
name="all-corpus",
)
print(ds)
print(ds["train"][0])
# DatasetDict({
# train: Dataset({
# features: ['dialogue', 'board'],
# num_rows: 8134707
# })
# })
# {'dialogue': {'speaker': [1, 2], 'content': ['実況スレをたてる', 'おんj民の鑑']}, 'board': 'livejupiter'}
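# The columnar `dialogue` field can be flattened into (speaker, content)
# turns with a small helper (illustrative; not part of the dataset's API):
def dialogue_to_turns(example):
    d = example["dialogue"]
    return list(zip(d["speaker"], d["content"]))

# e.g. dialogue_to_turns(ds["train"][0])
# → [(1, '実況スレをたてる'), (2, 'おんj民の鑑')]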
``` |
freshpearYoon/train_free_15 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604551424
num_examples: 10000
download_size: 1168717994
dataset_size: 9604551424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nicholasbien/lakh-txt-full-v2-tokenized | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1568058800
num_examples: 13560
- name: test
num_bytes: 388986115
num_examples: 3390
download_size: 674976322
dataset_size: 1957044915
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
dot-ammar/AR-multi-dotted-Small | ---
dataset_info:
features:
- name: Masked
dtype: string
- name: Options
sequence: string
- name: Target
dtype: string
splits:
- name: train
num_bytes: 11340965377
num_examples: 500000
download_size: 1916493752
dataset_size: 11340965377
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_UCLA-AGI__test-test | ---
pretty_name: Evaluation run of UCLA-AGI/test-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [UCLA-AGI/test-test](https://huggingface.co/UCLA-AGI/test-test) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__test-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T03:24:05.759125](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test-test/blob/main/results_2024-01-06T03-24-05.759125.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6137450258527077,\n\
\ \"acc_stderr\": 0.03285635549826058,\n \"acc_norm\": 0.619742776234521,\n\
\ \"acc_norm_stderr\": 0.03352418559465981,\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.017225627083660867,\n \"mc2\": 0.5774588897502617,\n\
\ \"mc2_stderr\": 0.015854382987078947\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n\
\ \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6748655646285601,\n\
\ \"acc_stderr\": 0.004674677287148618,\n \"acc_norm\": 0.858195578570006,\n\
\ \"acc_norm_stderr\": 0.003481364840770976\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310267,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310267\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438899,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438899\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.012682016335646671,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.012682016335646671\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111844,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111844\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.017225627083660867,\n \"mc2\": 0.5774588897502617,\n\
\ \"mc2_stderr\": 0.015854382987078947\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836675\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32752084912812734,\n \
\ \"acc_stderr\": 0.012927102210426476\n }\n}\n```"
repo_url: https://huggingface.co/UCLA-AGI/test-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|arc:challenge|25_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|arc:challenge|25_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|arc:challenge|25_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|arc:challenge|25_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|gsm8k|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|gsm8k|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|gsm8k|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|gsm8k|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hellaswag|10_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hellaswag|10_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hellaswag|10_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hellaswag|10_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T20-55-50.355988.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T21-43-53.748756.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T00-47-43.486217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T03-24-05.759125.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- '**/details_harness|winogrande|5_2024-01-05T20-55-50.355988.parquet'
- split: 2024_01_05T21_43_53.748756
path:
- '**/details_harness|winogrande|5_2024-01-05T21-43-53.748756.parquet'
- split: 2024_01_06T00_47_43.486217
path:
- '**/details_harness|winogrande|5_2024-01-06T00-47-43.486217.parquet'
- split: 2024_01_06T03_24_05.759125
path:
- '**/details_harness|winogrande|5_2024-01-06T03-24-05.759125.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T03-24-05.759125.parquet'
- config_name: results
data_files:
- split: 2024_01_05T20_55_50.355988
path:
- results_2024-01-05T20-55-50.355988.parquet
- split: 2024_01_05T21_43_53.748756
path:
- results_2024-01-05T21-43-53.748756.parquet
- split: 2024_01_06T00_47_43.486217
path:
- results_2024-01-06T00-47-43.486217.parquet
- split: 2024_01_06T03_24_05.759125
path:
- results_2024-01-06T03-24-05.759125.parquet
- split: latest
path:
- results_2024-01-06T03-24-05.759125.parquet
---
# Dataset Card for Evaluation run of UCLA-AGI/test-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/test-test](https://huggingface.co/UCLA-AGI/test-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test-test",
"harness_winogrande_5",
split="train")
```
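To load a specific run rather than the latest one, pass that run's split name instead of `"train"`. Split names are simply the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper name is illustrative, not part of this dataset):

```python
def run_split_name(timestamp: str) -> str:
    """Convert a run timestamp (e.g. "2024-01-06T03:24:05.759125")
    to its split name (e.g. "2024_01_06T03_24_05.759125").

    Illustrative helper: split names in this dataset replace '-' and ':'
    in the run timestamp with '_'.
    """
    return timestamp.replace("-", "_").replace(":", "_")
```

For example, `run_split_name("2024-01-06T03:24:05.759125")` gives `"2024_01_06T03_24_05.759125"`, which can be passed as the `split` argument to `load_dataset`.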
## Latest results
These are the [latest results from run 2024-01-06T03:24:05.759125](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test-test/blob/main/results_2024-01-06T03-24-05.759125.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6137450258527077,
"acc_stderr": 0.03285635549826058,
"acc_norm": 0.619742776234521,
"acc_norm_stderr": 0.03352418559465981,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.017225627083660867,
"mc2": 0.5774588897502617,
"mc2_stderr": 0.015854382987078947
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6748655646285601,
"acc_stderr": 0.004674677287148618,
"acc_norm": 0.858195578570006,
"acc_norm_stderr": 0.003481364840770976
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310267,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438899,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438899
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.012682016335646671,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.012682016335646671
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111844,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111844
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.017225627083660867,
"mc2": 0.5774588897502617,
"mc2_stderr": 0.015854382987078947
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836675
},
"harness|gsm8k|5": {
"acc": 0.32752084912812734,
"acc_stderr": 0.012927102210426476
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bezzam/lensless-test | ---
dataset_info:
features:
- name: lensless
dtype: image
- name: lensed
dtype: image
splits:
- name: train
num_bytes: 12891178
num_examples: 2
- name: test
num_bytes: 6944240
num_examples: 1
download_size: 19839183
dataset_size: 19835418
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
task_categories:
- image-to-image
tags:
- lensless
size_categories:
- n<1K
--- |
toki64/rnabert_dataset_small | ---
license: openrail
---
|
nkandpa2/qa_entities | ---
license: bigscience-openrail-m
---
|
CATIE-AQ/squad_v2_french_translated_fr_prompt_question_generation_with_context | ---
language:
- fr
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- squad_v2_french_translated
---
# squad_v2_french_translated_fr_prompt_question_generation_with_context
## Summary
**squad_v2_french_translated_fr_prompt_question_generation_with_context** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **3,795,312** rows that can be used for a question-generation (with context) task.
The original data (without prompts) comes from the dataset [pragnakalp/squad_v2_french_translated](https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated) and was augmented with questions in SQuAD 2.0 format in the [FrenchQA](https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. The logic applied consists of proposing prompts in the infinitive, in the informal imperative (*tutoiement*) and in the formal imperative (*vouvoiement*).
```
'"'+context+'"\n Générer une question à partir du texte ci-dessus : ',
'"'+context+'"\n Génère une question à partir du texte ci-dessus : ',
'"'+context+'"\n Générez une question à partir du texte ci-dessus : ',
'"'+context+'"\n Trouver une question à partir du texte ci-dessus : ',
'"'+context+'"\n Trouve une question à partir du texte ci-dessus : ',
'"'+context+'"\n Trouvez une question à partir du texte ci-dessus : ',
'"'+context+'"\n Créer une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Crée trouver une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Créez trouver une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Ecrire une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Ecris une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Ecrivez une bonne question à partir du texte ci-dessus : ',
'Générer une bonne question pour le texte suivant : "'+context+'"',
'Génère une bonne question pour le texte suivant : "'+context+'"',
'Générez une bonne question pour le texte suivant : "'+context+'"',
'Trouver une bonne question pour le texte suivant : "'+context+'"',
'Trouve une bonne question pour le texte suivant : "'+context+'"',
'Trouvez trouver une bonne question pour le texte suivant : "'+context+'"',
'Créer une bonne question pour le texte suivant : "'+context+'"',
'Crée trouver une bonne question pour le texte suivant : "'+context+'"',
'Créez trouver une bonne question pour le texte suivant : "'+context+'"',
'Ecrire une bonne question pour le texte suivant : "'+context+'"',
'Ecris une bonne question pour le texte suivant : "'+context+'"',
'Ecrivez une bonne question pour le texte suivant : "'+context+'"'
```
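As a rough illustration only (the actual DFP construction code is not published here, and `build_row` is a hypothetical helper), each row pairs a context wrapped in one of the prompt patterns above with the original question, matching the `inputs`/`targets` format of xP3:

```python
import random

# Illustrative sketch only — not the actual DFP pipeline code.
# One of the 24 prompt patterns above is applied to a context to
# form the model input; the original question becomes the target.
def build_row(context, question):
    patterns = [
        '"' + context + '"\n Générer une question à partir du texte ci-dessus : ',
        'Générer une bonne question pour le texte suivant : "' + context + '"',
        # ...the remaining patterns from the list above
    ]
    return {"inputs": random.choice(patterns), "targets": question}

row = build_row("Le CATIE est situé à Talence.", "Où est situé le CATIE ?")
print(row["targets"])  # Où est situé le CATIE ?
```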
# Splits
- `train` with 3,795,312 samples
- no `valid` split
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/squad_v2_french_translated_fr_prompt_question_generation_with_context")
```
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated
## This Dataset
```
@misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author    = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title     = { DFP (Revision 1d24c09) },
  year      = 2023,
  url       = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi       = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
apache-2.0 |
yangyz1230/H3K4me2_not_filtered | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 426630
num_examples: 776
- name: test
num_bytes: 48315
num_examples: 88
download_size: 228842
dataset_size: 474945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
lIlBrother/KsponSpeech-Phonetic | ---
license: apache-2.0
---
|
euclaise/LittleTown | ---
license: other
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 75640201
num_examples: 100000
download_size: 16577014
dataset_size: 75640201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LittleTown"
[Language models are greedy reasoners](https://arxiv.org/pdf/2210.01240.pdf), so they don't often backtrack. This is a dataset made to teach them backtracking. The data is synthetic, generated randomly in Python.
90% of the examples contain backtracking.
License:
```
Zero-Clause BSD
=============
Permission to use, copy, modify, and/or distribute this software for
any purpose with or without fee is hereby granted.
THE SOFTWARE IS PROVIDED “AS IS” AND THE AUTHOR DISCLAIMS ALL
WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES
OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE
FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY
DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN
AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
``` |
Benson/dolly_dialog_subset | ---
language:
- en
--- |
CreativeLang/EPIC_Irony | ---
dataset_info:
features:
- name: user
dtype: string
- name: label
dtype: string
- name: timestamp
dtype: string
- name: source
dtype: string
- name: subreddit
dtype: string
- name: id_original
dtype: string
- name: text
dtype: string
- name: parent_id_original
dtype: string
- name: parent_text
dtype: string
- name: Language_instance
dtype: string
- name: Language_variety
dtype: string
- name: Age
dtype: string
- name: Sex
dtype: string
- name: Ethnicity simplified
dtype: string
- name: Country of birth
dtype: string
- name: Country of residence
dtype: string
- name: Nationality
dtype: string
- name: Language_annotator
dtype: string
- name: Student status
dtype: string
- name: Employment status
dtype: string
splits:
- name: train
num_bytes: 7299373
num_examples: 14172
download_size: 1038853
dataset_size: 7299373
---
# EPIC_Irony
- paper: [EPIC: Multi-Perspective Annotation of a Corpus of Irony](https://assets.amazon.science/40/b4/0f6ec06a4a33a44485de1b2b57c7/epic-multi-perspective-annotation-of-a-corpus-of-irony.pdf) at ACL 2023
Key features:
- EPIC (English Perspectivist Irony Corpus) is an annotated corpus for irony analysis based on data perspectivism principles.
- The corpus contains social media conversations in five regional varieties of English, annotated by contributors from corresponding countries.
- The dataset explores the perspectives of annotators, taking into account their origin, age, and gender.
- Perspective-aware models were created to validate EPIC, and these proved more effective and confident in identifying irony than non-perspectivist models.
- The models showcase variation in irony perception across different demographic groups.
- EPIC serves as a valuable resource for training perspective-aware models for irony detection.
Metadata in Creative Language Toolkit ([CLTK](https://github.com/liyucheng09/cltk))
- CL Type: Irony
- Task Type: detection
- Size: 14k
- Created time: 2023 |
huggingartists/sum-41 | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/sum-41"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.196472 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/7cf5f61ac4ffe9a0fd1f6a4b235b95eb.320x320x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/sum-41">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sum 41</div>
<a href="https://genius.com/artists/sum-41">
<div style="text-align: center; font-size: 14px;">@sum-41</div>
</a>
</div>
### Dataset Summary
Lyrics parsed from Genius. This dataset is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/sum-41).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sum-41")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|134| -| -|
'Train' can easily be divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/sum-41")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
BangumiBase/toarukagakunoaccelerator | ---
license: mit
tags:
- art
size_categories:
- n<1K
---
# Bangumi Image Base of Toaru Kagaku No Accelerator
This is the image base of bangumi Toaru Kagaku no Accelerator, we detected 24 characters, 904 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
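One simple, purely illustrative cleanup heuristic (not part of the dataset tooling; the 5 kB threshold is an arbitrary example value) is to discard suspiciously small files after extracting a character zip, since broken or near-empty crops tend to be tiny:

```python
import os

def drop_tiny_files(folder, min_bytes=5_000):
    """Sketch of a cleanup pass: keep only files at least
    `min_bytes` large, treating smaller ones as likely noise.
    This is an example heuristic, not the dataset's own tooling."""
    kept = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getsize(path) >= min_bytes:
            kept.append(path)
    return kept
```

A manual review of the kept files (or any other filtering criterion, such as image resolution) is still advisable before training.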
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 16 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 12 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 20 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 25 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 68 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 14 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 8 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 17 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 28 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 121 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 12 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 25 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 12 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 137 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 8 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 15 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 25 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 49 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 21 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 24 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 67 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 13 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 7 | [Download](22/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 160 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v2 | ---
pretty_name: Evaluation run of localfultonextractor/Erosumika-7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [localfultonextractor/Erosumika-7B-v2](https://huggingface.co/localfultonextractor/Erosumika-7B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T19:59:34.099586](https://huggingface.co/datasets/open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v2/blob/main/results_2024-03-24T19-59-34.099586.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6258885581835653,\n\
\ \"acc_stderr\": 0.032806762757423245,\n \"acc_norm\": 0.6296491064965818,\n\
\ \"acc_norm_stderr\": 0.033463245993051116,\n \"mc1\": 0.5116279069767442,\n\
\ \"mc1_stderr\": 0.01749876717574008,\n \"mc2\": 0.6900096441749616,\n\
\ \"mc2_stderr\": 0.015127251990438156\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156222\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6946823341963753,\n\
\ \"acc_stderr\": 0.004596006250433554,\n \"acc_norm\": 0.8628759211312488,\n\
\ \"acc_norm_stderr\": 0.0034327529819187983\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851102,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851102\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.01709057380421791,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.01709057380421791\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406978,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406978\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n\
\ \"acc_stderr\": 0.014616099385833676,\n \"acc_norm\": 0.7879948914431673,\n\
\ \"acc_norm_stderr\": 0.014616099385833676\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48379888268156424,\n\
\ \"acc_stderr\": 0.01671372072950102,\n \"acc_norm\": 0.48379888268156424,\n\
\ \"acc_norm_stderr\": 0.01671372072950102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5116279069767442,\n\
\ \"mc1_stderr\": 0.01749876717574008,\n \"mc2\": 0.6900096441749616,\n\
\ \"mc2_stderr\": 0.015127251990438156\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4518574677786202,\n \
\ \"acc_stderr\": 0.013708494995677636\n }\n}\n```"
repo_url: https://huggingface.co/localfultonextractor/Erosumika-7B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|arc:challenge|25_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|gsm8k|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hellaswag|10_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T19-59-34.099586.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T19-59-34.099586.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- '**/details_harness|winogrande|5_2024-03-24T19-59-34.099586.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T19-59-34.099586.parquet'
- config_name: results
data_files:
- split: 2024_03_24T19_59_34.099586
path:
- results_2024-03-24T19-59-34.099586.parquet
- split: latest
path:
- results_2024-03-24T19-59-34.099586.parquet
---
# Dataset Card for Evaluation run of localfultonextractor/Erosumika-7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [localfultonextractor/Erosumika-7B-v2](https://huggingface.co/localfultonextractor/Erosumika-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v2",
    "harness_winogrande_5",
    split="latest",  # the configs define timestamped splits and "latest"
)
```
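Each config name encodes the harness task and the few-shot count (for example, `harness_hendrycksTest_anatomy_5` is the 5-shot MMLU anatomy subject). A small illustrative helper (not part of the harness itself) can split a config name back into those two parts:

```python
def parse_config_name(name: str) -> tuple[str, int]:
    """Split a leaderboard config name into (task, num_fewshot).

    e.g. "harness_hendrycksTest_anatomy_5" -> ("hendrycksTest_anatomy", 5)
    """
    assert name.startswith("harness_"), name
    body = name[len("harness_"):]
    # The few-shot count is always the last underscore-separated chunk.
    task, _, shots = body.rpartition("_")
    return task, int(shots)

print(parse_config_name("harness_hendrycksTest_high_school_physics_5"))
# → ('hendrycksTest_high_school_physics', 5)
```

Note that the stored file names use `|` and `:` separators (e.g. `harness|hendrycksTest-anatomy|5`), while config names replace those characters with underscores, so this helper recovers the underscored form only.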
## Latest results
These are the [latest results from run 2024-03-24T19:59:34.099586](https://huggingface.co/datasets/open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v2/blob/main/results_2024-03-24T19-59-34.099586.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of the corresponding config):
```python
{
"all": {
"acc": 0.6258885581835653,
"acc_stderr": 0.032806762757423245,
"acc_norm": 0.6296491064965818,
"acc_norm_stderr": 0.033463245993051116,
"mc1": 0.5116279069767442,
"mc1_stderr": 0.01749876717574008,
"mc2": 0.6900096441749616,
"mc2_stderr": 0.015127251990438156
},
"harness|arc:challenge|25": {
"acc": 0.6296928327645052,
"acc_stderr": 0.01411129875167495,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156222
},
"harness|hellaswag|10": {
"acc": 0.6946823341963753,
"acc_stderr": 0.004596006250433554,
"acc_norm": 0.8628759211312488,
"acc_norm_stderr": 0.0034327529819187983
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851102,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851102
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.01709057380421791,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.01709057380421791
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406978,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406978
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833676,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833676
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48379888268156424,
"acc_stderr": 0.01671372072950102,
"acc_norm": 0.48379888268156424,
"acc_norm_stderr": 0.01671372072950102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5116279069767442,
"mc1_stderr": 0.01749876717574008,
"mc2": 0.6900096441749616,
"mc2_stderr": 0.015127251990438156
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091088
},
"harness|gsm8k|5": {
"acc": 0.4518574677786202,
"acc_stderr": 0.013708494995677636
}
}
```
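The aggregate "all" numbers above are means over the per-task entries; for MMLU, that is the mean of the per-subject `acc` values. A minimal sketch of that aggregation, using a hypothetical two-subject subset of the results dict for brevity (whether a given leaderboard version averages `acc` or `acc_norm` varies by task, so `acc` here is for illustration only):

```python
# Hypothetical subset of the results dict shown above; the real dict has one
# "harness|hendrycksTest-<subject>|5" entry per MMLU subject.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|arc:challenge|25": {"acc": 0.6296928327645052},  # not MMLU, excluded
}

# Unweighted mean over the MMLU (hendrycksTest) subjects only.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # → 0.4563
```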
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_winglian__llama-2-4b | ---
pretty_name: Evaluation run of winglian/llama-2-4b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [winglian/llama-2-4b](https://huggingface.co/winglian/llama-2-4b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_winglian__llama-2-4b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T09:12:20.462280](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__llama-2-4b/blob/main/results_2023-10-26T09-12-20.462280.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826824,\n \"f1\": 0.039830117449664484,\n\
\ \"f1_stderr\": 0.0011176272810803495,\n \"acc\": 0.2895672680207256,\n\
\ \"acc_stderr\": 0.007874406355324997\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826824,\n\
\ \"f1\": 0.039830117449664484,\n \"f1_stderr\": 0.0011176272810803495\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036202\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.574585635359116,\n \"acc_stderr\": 0.013895257666646373\n\
\ }\n}\n```"
repo_url: https://huggingface.co/winglian/llama-2-4b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T09_12_20.462280
path:
- '**/details_harness|drop|3_2023-10-26T09-12-20.462280.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T09-12-20.462280.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T09_12_20.462280
path:
- '**/details_harness|gsm8k|5_2023-10-26T09-12-20.462280.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T09-12-20.462280.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-22-33.156570.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T14-22-33.156570.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T09_12_20.462280
path:
- '**/details_harness|winogrande|5_2023-10-26T09-12-20.462280.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T09-12-20.462280.parquet'
- config_name: results
data_files:
- split: 2023_10_03T14_22_33.156570
path:
- results_2023-10-03T14-22-33.156570.parquet
- split: 2023_10_26T09_12_20.462280
path:
- results_2023-10-26T09-12-20.462280.parquet
- split: latest
path:
- results_2023-10-26T09-12-20.462280.parquet
---
# Dataset Card for Evaluation run of winglian/llama-2-4b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/winglian/llama-2-4b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [winglian/llama-2-4b](https://huggingface.co/winglian/llama-2-4b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
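As noted above, per-run splits are named after the run timestamp. A minimal sketch of that naming convention as it appears in this repo's config (the helper name is ours, not part of any library):

```python
# Hypothetical helper illustrating the split-naming convention used here:
# an ISO run timestamp becomes a split name with '-' and ':' replaced by '_'
# (the fractional seconds keep their dot).
def run_timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-10-26T09:12:20.462280"))
# -> 2023_10_26T09_12_20.462280
```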
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_winglian__llama-2-4b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-26T09:12:20.462280](https://huggingface.co/datasets/open-llm-leaderboard/details_winglian__llama-2-4b/blob/main/results_2023-10-26T09-12-20.462280.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in its own results and "latest" split):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826824,
"f1": 0.039830117449664484,
"f1_stderr": 0.0011176272810803495,
"acc": 0.2895672680207256,
"acc_stderr": 0.007874406355324997
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826824,
"f1": 0.039830117449664484,
"f1_stderr": 0.0011176272810803495
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036202
},
"harness|winogrande|5": {
"acc": 0.574585635359116,
"acc_stderr": 0.013895257666646373
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BomboMaster/unl_tesis_linea_investigacion | ---
license: apache-2.0
---
|
minoosh/empathetic_dialogues_forLlama2 | ---
dataset_info:
features:
- name: conv
dtype: string
splits:
- name: train
num_bytes: 10014320
num_examples: 23149
download_size: 5336757
dataset_size: 10014320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davanstrien/WELFake | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': fake
'1': real
splits:
- name: train
num_bytes: 245239522
num_examples: 72134
download_size: 151915950
dataset_size: 245239522
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
We designed a larger and more generic Word Embedding over Linguistic Features for Fake News Detection (WELFake) dataset of 72,134 news articles, with 35,028 real and 37,106 fake news items. For this, we merged four popular news datasets (Kaggle, McIntire, Reuters, and BuzzFeed Political) to prevent over-fitting of classifiers and to provide more text data for better ML training.
The dataset contains four columns: Serial number (starting from 0); Title (the news heading); Text (the news content); and Label (0 = fake and 1 = real).
There are 78,098 entries in the source CSV file, of which only 72,134 are accessible via the data frame.
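A quick sanity check on the counts stated above: the real and fake items should sum to the 72,134 examples in the train split, and the class balance can be derived directly (the variable names here are illustrative):

```python
# Verify the stated counts and derive the class balance of WELFake.
real, fake = 35_028, 37_106
total = real + fake
assert total == 72_134  # matches num_examples in the train split above

print(f"real: {real / total:.1%}, fake: {fake / total:.1%}")
# -> real: 48.6%, fake: 51.4%
```

The dataset is therefore close to balanced, so plain accuracy is a reasonable first metric for classifiers trained on it.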
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
delphi-suite/v0-next-logprobs-llama2-12.8m | ---
dataset_info:
features:
- name: logprobs
sequence: float64
splits:
- name: validation
num_bytes: 45818277
num_examples: 10982
download_size: 37722133
dataset_size: 45818277
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
baptistecolle/sam-controlnet-original | ---
dataset_info:
features:
- name: image
dtype: image
- name: filepath
dtype: string
- name: sentids
list: int32
- name: filename
dtype: string
- name: imgid
dtype: int32
- name: split
dtype: string
- name: sentences
struct:
- name: tokens
list: string
- name: raw
dtype: string
- name: imgid
dtype: int32
- name: sentid
dtype: int32
- name: cocoid
dtype: int32
splits:
- name: original
num_bytes: 160172441.0
num_examples: 1000
download_size: 32080306
dataset_size: 160172441.0
---
# Dataset Card for "sam-controlnet-original"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713209049 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25677
num_examples: 72
download_size: 22411
dataset_size: 25677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713209049"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sheelsansare/cs224w-govReportKG | ---
license: apache-2.0
---
|
patimus-prime/strain_selection | ---
license: mit
---
|
AlanYky/hate-with-instruction-with-label | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 3929807
num_examples: 2000
download_size: 1660884
dataset_size: 3929807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
socjopata99/elo | ---
license: apache-2.0
---
|
MaskinaMaskina/Dreambooth_maskina | ---
license: unknown
---
|
Solshine/Hindi_English_QandA_Synth_Data_For_Hinglish_Project | ---
license: apache-2.0
---
## ReadMe: Hindi-English Synthetic Dataset - Question-Answer Style Dialogues on Ethics, Systems Thinking, and Creative Problem-Solving with High-Level Principles of Mind
**Dataset Description:**
This was created as part of the Hugging Face Hinglish Training Project, coordinated on the Hugging Face Discord.
This dataset presents a hand-curated collection of dialogues and prompt-answer pairs in Hindi, with English translations, that focus on ethics, systems thinking, and creative problem-solving, framed within the context of high-level principles of mind. The question-answer style dialogues span diverse domains and topics, including:
* **Artificial Intelligence and Ethics:** Questions and answers about ethical decision-making in autonomous vehicles, healthcare systems, and language models.
* **Environment and Sustainable Development:** Questions and answers about sustainable agricultural practices, water management, and tackling climate change.
* **Education and Creativity:** Questions and answers about effective teaching methods for developing creative problem-solving skills in students.
* **Analysis of Complex Systems:** Questions and answers about understanding complex ecosystems, microservices architectures, and socio-economic systems.
This dataset includes the best (human-selected) examples, prompted to demonstrate excellent Hindi-English-Hinglish translation, systems thinking, structured thinking, Hindi fundamentals, good moral character, ethics fundamentals, coherent question answering, step-by-step advanced reasoning, helpfulness with tasks relevant to Indian society, some limited Indian cultural archetypes, and step-by-step theory-of-mind analysis.
These examples were generated using the public Gemini-powered version of Google Bard between December 28th, 2023 and January 2nd, 2024, then merged using a Python Pandas concatenation script.
**Each dialogue includes the following information:**
* **Hindi Question or Prompt:** A complex question related to the topic, many reflecting high-level principles of mind, ethical considerations, and STEM principles.
* **English Translation of Question or Prompt:** A literal translation of the Hindi question.
* **Suggested Response in Hindi:** A detailed response in Hindi incorporating high-level principles of mind with deep analysis, collaboration, and problem-solving.
* **English Translation of Response:** A literal translation of the suggested response.
**Potential Applications:**
This dataset can be useful for the following applications:
* **Training Dialogue Systems and AI Assistants:** Train assistant models to enable informed and deep conversations on ethical and complex topics.
* **Developing Educational Material on Ethics and Systems Thinking:** Design curriculum and training modules to promote ethical and systemic thinking in various domains.
* **Modeling Creative Problem-Solving Strategies:** Develop models for creative thinking and exploring innovative solutions.
* **Improving Hindi-English Machine Translation:** Train language models with complex topics and advanced language usage.
**Important Notes when using the Dataset:**
* There are some blank fields, most often in the fourth column (the English translation of the response), and occasional partial translations rather than the full translation text. A single blank field in an otherwise complete row is somewhat common, roughly estimated to occur in between 0.5% and 4% of the observations. This is due to Bard's token limit for each batch of data (each batch contained between 2 and 15 observations before being merged with a purpose-built Python script).
* There are also some English theory-of-mind (ToM) descriptive artifacts in some of the Hindi responses for the more advanced ToM-focused observations. This may be helpful in some contexts, such as using English understanding to process ToM when answering in Hindi; it may also be seen as negative, since it places English text within the column meant for Hindi, cross-polluting it. Nonetheless, these artifacts represent a very small portion of the dataset.
* This dataset does not endorse any specific product or service.
* It is helpful to consider ethical principles and accountability when using any dataset.
* The presented responses on various topics do not represent expert opinions but rather aim to inspire potential directions and discussions.
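The batch-merging step mentioned in the notes above can be sketched as follows (a minimal illustration with hypothetical column names and toy rows; the project's actual batch files and schema are not published):

```python
import pandas as pd

# Hypothetical column names; the real script's schema is not published.
COLUMNS = ["hindi_prompt", "english_prompt", "hindi_response", "english_response"]

# Toy batches standing in for the 2-15-observation Bard outputs.
batch_1 = pd.DataFrame(
    [["प्रश्न 1", "Question 1", "उत्तर 1", "Answer 1"]], columns=COLUMNS
)
batch_2 = pd.DataFrame(
    [["प्रश्न 2", "Question 2", "उत्तर 2", None]], columns=COLUMNS
)

# Concatenate all batches into one frame, re-indexing the rows.
merged = pd.concat([batch_1, batch_2], ignore_index=True)

# Blank fields (like the missing fourth-column translation above) survive as None/NaN.
print(merged.shape)  # (2, 4)
```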
**We hope this dataset inspires you to think about and discuss high-level principles of mind in the context of ethics, systems thinking, and creative problem-solving!**
openclimatefix/eumetsat-iodc | ---
license: mit
---
|
MexIvanov/Vezora-Tested-22k-Python-Alpaca-ru | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- ru
tags:
- python
- code
---
A machine-translated version of the Vezora/Tested-22k-Python-Alpaca dataset.
It consists of code filtered using Vezora's CodeTester, together with code-related data and natural-language instructions.
Released under the same license as the original dataset. Provided as-is, with research intent; use/read at your own risk.
simonottosen/security_data | ---
license: mit
---
|
Multimodal-Fatima/VQAv2_validation_no_image | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
splits:
- name: validation
num_bytes: 11070187868
num_examples: 214354
download_size: 2794930371
dataset_size: 11070187868
---
# Dataset Card for "VQAv2_validation_no_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_psmathur__orca_mini_13b | ---
pretty_name: Evaluation run of psmathur/orca_mini_13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_13b](https://huggingface.co/psmathur/orca_mini_13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T13:38:44.745207](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_13b/blob/main/results_2023-10-15T13-38-44.745207.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03355704697986577,\n\
\ \"em_stderr\": 0.001844249316229893,\n \"f1\": 0.11233116610738275,\n\
\ \"f1_stderr\": 0.002439557952450172,\n \"acc\": 0.3208366219415943,\n\
\ \"acc_stderr\": 0.006738290586283765\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03355704697986577,\n \"em_stderr\": 0.001844249316229893,\n\
\ \"f1\": 0.11233116610738275,\n \"f1_stderr\": 0.002439557952450172\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6416732438831886,\n\
\ \"acc_stderr\": 0.01347658117256753\n }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T13_38_44.745207
path:
- '**/details_harness|drop|3_2023-10-15T13-38-44.745207.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T13-38-44.745207.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T13_38_44.745207
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-38-44.745207.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-38-44.745207.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:53:33.020588.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:53:33.020588.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T13_38_44.745207
path:
- '**/details_harness|winogrande|5_2023-10-15T13-38-44.745207.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T13-38-44.745207.parquet'
- config_name: results
data_files:
- split: 2023_08_09T09_53_33.020588
path:
- results_2023-08-09T09:53:33.020588.parquet
- split: 2023_10_15T13_38_44.745207
path:
- results_2023-10-15T13-38-44.745207.parquet
- split: latest
path:
- results_2023-10-15T13-38-44.745207.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_13b](https://huggingface.co/psmathur/orca_mini_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_13b",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T13:38:44.745207](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_13b/blob/main/results_2023-10-15T13-38-44.745207.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.03355704697986577,
"em_stderr": 0.001844249316229893,
"f1": 0.11233116610738275,
"f1_stderr": 0.002439557952450172,
"acc": 0.3208366219415943,
"acc_stderr": 0.006738290586283765
},
"harness|drop|3": {
"em": 0.03355704697986577,
"em_stderr": 0.001844249316229893,
"f1": 0.11233116610738275,
"f1_stderr": 0.002439557952450172
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6416732438831886,
"acc_stderr": 0.01347658117256753
}
}
```
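As a quick sanity check on the numbers above, the `all` block matches the mean of the per-task metrics for this run (a hedged sketch; the leaderboard's own aggregation code is not reproduced here):

```python
# Sketch: for this run, the "all" accuracy equals the mean of the
# per-task accuracies (gsm8k and winogrande); same for the stderr.
gsm8k_acc = 0.0
winogrande_acc = 0.6416732438831886
all_acc = (gsm8k_acc + winogrande_acc) / 2
all_stderr = (0.0 + 0.01347658117256753) / 2
print(all_acc)     # 0.3208366219415943
print(all_stderr)  # 0.006738290586283765
```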
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vrclc/imasc_slr | ---
dataset_info:
features:
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 5727891472.0
num_examples: 41144
download_size: 5607215503
dataset_size: 5727891472.0
---
# Dataset Card for "imasc_slr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_same_length_find_passage_train10_eval10_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10882
num_examples: 30
- name: validation
num_bytes: 3378
num_examples: 10
download_size: 13851
dataset_size: 14260
---
# Dataset Card for "random_letter_same_length_find_passage_train10_eval10_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
turkish-nlp-suite/turkish-wikiNER | ---
language:
- tr
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Turkish-WikiNER
---
# Dataset Card for "turkish-nlp-suite/turkish-wikiNER"
<img src="https://raw.githubusercontent.com/turkish-nlp-suite/.github/main/profile/wiki.png" width="20%" height="20%">
## Dataset Description
- **Repository:** [Turkish-WikiNER](https://github.com/turkish-nlp-suite/Turkish-Wiki-NER-Dataset)
- **Paper:** [ACL link]()
- **Dataset:** Turkish-WikiNER
- **Domain:** Wiki
- **Number of Labels:** 18
### Dataset Summary
Turkish NER dataset built from Wikipedia sentences. 20,000 sentences were sampled and re-annotated from the [Kuzgunlar NER dataset](https://data.mendeley.com/datasets/cdcztymf4k/1).
Annotations were done by [Co-one](https://co-one.co/). Many thanks to them for their contributions. This dataset is also used in our brand-new spaCy Turkish packages.
### Dataset Instances
An instance of this dataset looks as follows:
```
{
"tokens": ["Çekimler", "5", "Temmuz", "2005", "tarihinde", "Reebok", "Stadyum", ",", "Bolton", ",", "İngiltere'de", "yapılmıştır", "."],
"tags": ["O", "B-DATE", "I-DATE", "I-DATE", "O", "B-FAC", "I-FAC", "O", "B-GPE", "O", "B-GPE", "O", "O"]
}
```
or even better:

### Labels
- CARDINAL
- DATE
- EVENT
- FAC
- GPE
- LANGUAGE
- LAW
- LOC
- MONEY
- NORP
- ORDINAL
- ORG
- PERCENT
- PERSON
- PRODUCT
- QUANTITY
- TIME
- TITLE
- WORK_OF_ART
### Data Split
| name |train|validation|test|
|---------|----:|---------:|---:|
|Turkish-WikiNER|18000| 1000|1000|
### Citation
This work is supported by the Google Developer Experts Program. It is part of the Duygu 2022 Fall-Winter collection, "Turkish NLP with Duygu" / "Duygu'yla Türkçe NLP". All rights reserved. If you'd like to use this dataset in your own work, please kindly cite [A Diverse Set of Freely Available Linguistic Resources for Turkish](https://aclanthology.org/2023.acl-long.768/):
```
@inproceedings{altinok-2023-diverse,
title = "A Diverse Set of Freely Available Linguistic Resources for {T}urkish",
author = "Altinok, Duygu",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.768",
pages = "13739--13750",
abstract = "This study presents a diverse set of freely available linguistic resources for Turkish natural language processing, including corpora, pretrained models and education material. Although Turkish is spoken by a sizeable population of over 80 million people, Turkish linguistic resources for natural language processing remain scarce. In this study, we provide corpora to allow practitioners to build their own applications and pretrained models that would assist industry researchers in creating quick prototypes. The provided corpora include named entity recognition datasets of diverse genres, including Wikipedia articles and supplement products customer reviews. In addition, crawling e-commerce and movie reviews websites, we compiled several sentiment analysis datasets of different genres. Our linguistic resources for Turkish also include pretrained spaCy language models. To the best of our knowledge, our models are the first spaCy models trained for the Turkish language. Finally, we provide various types of education material, such as video tutorials and code examples, that can support the interested audience on practicing Turkish NLP. The advantages of our linguistic resources are three-fold: they are freely available, they are first of their kind, and they are easy to use in a broad range of implementations. Along with a thorough description of the resource creation process, we also explain the position of our resources in the Turkish NLP world.",
}
```
|
seanghay/SynthKhmer-10k | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
struct:
- name: bbox
list:
- name: h
dtype: float64
- name: id
dtype: string
- name: w
dtype: float64
- name: x
dtype: float64
- name: 'y'
dtype: float64
- name: dateOfBirth
dtype: string
- name: fullName
dtype: string
- name: gender
dtype: string
- name: id
dtype: string
- name: label_dateOfBirth
dtype: string
- name: label_fullName
dtype: string
- name: label_gender
dtype: string
splits:
- name: train
num_bytes: 1172837507
num_examples: 10000
download_size: 1168963461
dataset_size: 1172837507
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-4.0
task_categories:
- image-to-text
language:
- km
pretty_name: SynthKhmer-10k
size_categories:
- 10K<n<100K
---
## SynthKhmer 10k
Image Size: 896x672
## Dataset Creation
- Background images are from `https://picsum.photos/`
- Profile Photos are from `https://thispersondoesnotexist.com/`
- Names are sampled from Khmer Dictionary
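Assuming (unverified — check a sample first) that the `x`/`y`/`w`/`h` values in each `ground_truth.bbox` entry are fractions of the image, converting them to pixel coordinates at the stated 896x672 size can be sketched as below; the `bbox_to_pixels` helper is hypothetical, not part of the dataset:

```python
# Hypothetical sketch: convert a ground_truth bbox to absolute pixel
# coordinates, ASSUMING x/y/w/h are stored as fractions of the image.
IMG_W, IMG_H = 896, 672  # image size stated above

def bbox_to_pixels(bbox, width=IMG_W, height=IMG_H):
    left = round(bbox["x"] * width)
    top = round(bbox["y"] * height)
    right = left + round(bbox["w"] * width)
    bottom = top + round(bbox["h"] * height)
    return (left, top, right, bottom)

print(bbox_to_pixels({"x": 0.25, "y": 0.5, "w": 0.5, "h": 0.25}))
# (224, 336, 672, 504)
```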
|
CyberHarem/unzen_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of unzen/雲仙/云仙 (Azur Lane)
This is the dataset of unzen/雲仙/云仙 (Azur Lane), containing 125 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, white_hair, purple_eyes, hair_over_one_eye, multicolored_hair, bangs, streaked_hair, very_long_hair, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 273.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 123.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 341 | 280.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 226.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 341 | 446.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/unzen_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/unzen_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, purple_bikini, navel, thighs, blush, purple_hair, simple_background, white_background, smile, purple_choker, collarbone, highleg_bikini, o-ring_bikini, bare_shoulders |
| 1 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, one_eye_covered, solo, blue_butterfly, japanese_clothes, white_panties, wide_sleeves, animal_ears, blush, thighs |
| 2 | 27 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, wide_sleeves, one_eye_covered, holding_sword, katana, thigh_strap, dress, japanese_clothes, thighs, sheath, blue_butterfly, between_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | purple_bikini | navel | thighs | blush | purple_hair | simple_background | white_background | smile | purple_choker | collarbone | highleg_bikini | o-ring_bikini | bare_shoulders | one_eye_covered | blue_butterfly | japanese_clothes | white_panties | wide_sleeves | animal_ears | holding_sword | katana | thigh_strap | dress | sheath | between_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:----------------|:--------|:---------|:--------|:--------------|:--------------------|:-------------------|:--------|:----------------|:-------------|:-----------------|:----------------|:-----------------|:------------------|:-----------------|:-------------------|:----------------|:---------------|:--------------|:----------------|:---------|:--------------|:--------|:---------|:------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | | | | | | | | | | X | X | X | X | X | X | | | | | | |
| 2 | 27 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | | | | | X | X | X | | X | | X | X | X | X | X | X |
|
joey234/mmlu-moral_scenarios-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 392005
num_examples: 895
download_size: 94736
dataset_size: 392005
---
# Dataset Card for "mmlu-moral_scenarios-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umm-maybe/story_dataset | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9490005
num_examples: 6168
- name: test
num_bytes: 1052968
num_examples: 686
download_size: 6360953
dataset_size: 10542973
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
unaidedelf87777/ultrachat-no-refusals | ---
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: refusal
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1352002990
num_examples: 199920
download_size: 709657565
dataset_size: 1352002990
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pablao0948/Kingo | ---
license: openrail
---
|
anishgillella/mini-platypus | ---
dataset_info:
features:
- name: train
struct:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4201526
num_examples: 1000
download_size: 2247200
dataset_size: 4201526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
r4ku/sky | ---
license: mit
---
|
varaslaw/RVC_V2_Kaggle_Ver_3.1 | ---
license: openrail
---
|
TheHenk/mydata | ---
license: mit
---
|
huggingnft/trippytoadznft | ---
tags:
- huggingnft
- nft
- huggan
- gan
- image
- images
task:
- unconditional-image-generation
datasets:
- huggingnft/trippytoadznft
license: mit
---
# Dataset Card
## Disclaimer
All rights belong to their owners.
Models and datasets can be removed from the site at the request of the copyright holder.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
NFT images dataset for unconditional generation.
NFT collection available [here](https://opensea.io/collection/trippytoadznft).
Model is available [here](https://huggingface.co/huggingnft/trippytoadznft).
Check Space: [link](https://huggingface.co/spaces/AlekseyKorshuk/huggingnft).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingnft/trippytoadznft")
```
## Dataset Structure
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
- `image`: an `image` feature.
- `id`: an `int` feature.
- `token_metadata`: a `str` feature.
- `image_original_url`: a `str` feature.
### Data Splits
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingnft,
  author={Aleksey Korshuk},
year=2022
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingnft)
|
reciprocate/pku_better_dpo_pairs | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 288521696
num_examples: 297394
- name: test
num_bytes: 31697426
num_examples: 33044
download_size: 184394435
dataset_size: 320219122
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
composite/pauq | ---
dataset_info:
- config_name: ru_os
features:
- name: id
dtype: string
- name: db_id
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: sql
sequence: string
- name: question_toks
sequence: string
- name: query_toks
sequence: string
- name: query_toks_no_values
sequence: string
- name: template
dtype: string
splits:
- name: train
num_examples: 8800
- name: test
num_examples: 1074
- config_name: en_os
features:
- name: id
dtype: string
- name: db_id
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: sql
sequence: string
- name: question_toks
sequence: string
- name: query_toks
sequence: string
- name: query_toks_no_values
sequence: string
- name: template
dtype: string
splits:
- name: train
num_examples: 8800
- name: test
num_examples: 1076
- config_name: ru_trl
features:
- name: id
dtype: string
- name: db_id
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: sql
sequence: string
- name: question_toks
sequence: string
- name: query_toks
sequence: string
- name: query_toks_no_values
sequence: string
- name: template
dtype: string
splits:
- name: train
num_examples: 7890
- name: test
num_examples: 1971
- config_name: en_trl
features:
- name: id
dtype: string
- name: db_id
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: sql
sequence: string
- name: question_toks
sequence: string
- name: query_toks
sequence: string
- name: query_toks_no_values
sequence: string
- name: template
dtype: string
splits:
- name: train
num_examples: 7890
- name: test
num_examples: 1974
- config_name: ru_tsl
features:
- name: id
dtype: string
- name: db_id
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: sql
sequence: string
- name: question_toks
sequence: string
- name: query_toks
sequence: string
- name: query_toks_no_values
sequence: string
- name: template
dtype: string
splits:
- name: train
num_examples: 7900
- name: test
num_examples: 1969
- config_name: en_tsl
features:
- name: id
dtype: string
- name: db_id
dtype: string
- name: source
dtype: string
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: sql
sequence: string
- name: question_toks
sequence: string
- name: query_toks
sequence: string
- name: query_toks_no_values
sequence: string
- name: template
dtype: string
splits:
- name: train
num_examples: 7900
- name: test
num_examples: 1974
---
# Dataset Card for PAUQ
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
Link to databases: https://drive.google.com/file/d/1Xjbp207zfCaBxhPgt-STB_RxwNo2TIW2/view
### Dataset Summary
The Russian version of the [Spider](https://yale-lily.github.io/spider) - Yale Semantic Parsing and Text-to-SQL Dataset.
Major changes:
- New Russian-language values are added (not replacing existing ones) to the DB tables; table and DB names remain the original.
- Natural language questions are localized into Russian, with all DB values in them replaced by the new ones.
- Filters in SQL queries are changed accordingly.
- Empty tables are filled with values.
- The dataset is complemented with new samples of underrepresented types.
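Each configuration above can be loaded by name with the `datasets` library; a minimal sketch (the repo id and config names are taken from this card's metadata, the helper function is ours):

```python
# Config names from the card's metadata: {ru, en} x {os, trl, tsl}.
CONFIGS = [f"{lang}_{suffix}" for lang in ("ru", "en") for suffix in ("os", "trl", "tsl")]

def load_pauq(config: str = "ru_os"):
    """Return the requested PAUQ configuration (e.g. 'ru_os')."""
    if config not in CONFIGS:
        raise ValueError(f"unknown config {config!r}; expected one of {CONFIGS}")
    # Deferred import so the helper stays importable without the library installed.
    from datasets import load_dataset
    return load_dataset("composite/pauq", config)
```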
### Languages
Russian
## Dataset Creation
### Curation Rationale
The translation from English to Russian was carried out by a professional human translator with SQL competence. Verification of the translated questions and their conformity with the queries, as well as updating of the databases, was performed by four computer science students.
Details are in the [section 3](https://aclanthology.org/2022.findings-emnlp.175.pdf).
## Additional Information
### Licensing Information
The presented dataset has been collected in a manner consistent with the terms of use of the original Spider, which is distributed under the CC BY-SA 4.0 license.
### Citation Information
[Paper link](https://aclanthology.org/2022.findings-emnlp.175.pdf)
```
@inproceedings{bakshandaeva-etal-2022-pauq,
title = "{PAUQ}: Text-to-{SQL} in {R}ussian",
author = "Bakshandaeva, Daria and
Somov, Oleg and
Dmitrieva, Ekaterina and
Davydova, Vera and
Tutubalina, Elena",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2022",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-emnlp.175",
pages = "2355--2376",
abstract = "Semantic parsing is an important task that allows to democratize human-computer interaction. One of the most popular text-to-SQL datasets with complex and diverse natural language (NL) questions and SQL queries is Spider. We construct and complement a Spider dataset for Russian, thus creating the first publicly available text-to-SQL dataset for this language. While examining its components - NL questions, SQL queries and databases content - we identify limitations of the existing database structure, fill out missing values for tables and add new requests for underrepresented categories. We select thirty functional test sets with different features that can be used for the evaluation of neural models{'} abilities. To conduct the experiments, we adapt baseline architectures RAT-SQL and BRIDGE and provide in-depth query component analysis. On the target language, both models demonstrate strong results with monolingual training and improved accuracy in multilingual scenario. In this paper, we also study trade-offs between machine-translated and manually-created NL queries. At present, Russian text-to-SQL is lacking in datasets as well as trained models, and we view this work as an important step towards filling this gap.",
}
```
### Contributions
Thanks to [@gugutse](https://github.com/Gugutse), [@runnerup96](https://github.com/runnerup96), [@dmi3eva](https://github.com/dmi3eva), [@veradavydova](https://github.com/VeraDavydova), [@tutubalinaev](https://github.com/tutubalinaev) for adding this dataset. |
aquillesdaamizade/adelaide-chang | ---
license: openrail
---
|
chitradrishti/reddit | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: index
dtype: string
- name: os_type
dtype: string
- name: wallpaper_dir
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: subreddit
dtype: string
- name: path
dtype: string
splits:
- name: train
num_bytes: 55173987511.3
num_examples: 62589
download_size: 62017960249
dataset_size: 55173987511.3
---
# Dataset Card for "reddit"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-ce0087-41465145072 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/roberta2roberta_L-24_cnn_daily_mail
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/roberta2roberta_L-24_cnn_daily_mail
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Riux](https://huggingface.co/Riux) for evaluating this model. |
open-llm-leaderboard/details_cyberagent__open-calm-7b | ---
pretty_name: Evaluation run of cyberagent/open-calm-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cyberagent/open-calm-7b](https://huggingface.co/cyberagent/open-calm-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cyberagent__open-calm-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T05:13:41.363810](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__open-calm-7b/blob/main/results_2023-09-18T05-13-41.363810.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009123322147651007,\n\
\ \"em_stderr\": 0.000973701770554162,\n \"f1\": 0.039316275167785276,\n\
\ \"f1_stderr\": 0.0014587233446804973,\n \"acc\": 0.24383651483119942,\n\
\ \"acc_stderr\": 0.00767932509907164\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.009123322147651007,\n \"em_stderr\": 0.000973701770554162,\n\
\ \"f1\": 0.039316275167785276,\n \"f1_stderr\": 0.0014587233446804973\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.48539857932123126,\n \"acc_stderr\": 0.014046492383275842\n\
\ }\n}\n```"
repo_url: https://huggingface.co/cyberagent/open-calm-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T05_13_41.363810
path:
- '**/details_harness|drop|3_2023-09-18T05-13-41.363810.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T05-13-41.363810.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T05_13_41.363810
path:
- '**/details_harness|gsm8k|5_2023-09-18T05-13-41.363810.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T05-13-41.363810.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:51:08.421995.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:51:08.421995.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T05_13_41.363810
path:
- '**/details_harness|winogrande|5_2023-09-18T05-13-41.363810.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T05-13-41.363810.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_51_08.421995
path:
- results_2023-07-19T16:51:08.421995.parquet
- split: 2023_09_18T05_13_41.363810
path:
- results_2023-09-18T05-13-41.363810.parquet
- split: latest
path:
- results_2023-09-18T05-13-41.363810.parquet
---
# Dataset Card for Evaluation run of cyberagent/open-calm-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cyberagent/open-calm-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cyberagent/open-calm-7b](https://huggingface.co/cyberagent/open-calm-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
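Judging from the configuration list above, a split name is derived from its run timestamp by replacing `-` and `:` with `_` (the dot before the microseconds is kept). A minimal helper sketching that mapping (the function name here is ours, not part of the leaderboard tooling):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2023-07-19T16:51:08.421995'
    to its split name '2023_07_19T16_51_08.421995'."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-18T05:13:41.363810"))
# 2023_09_18T05_13_41.363810
```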
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cyberagent__open-calm-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-18T05:13:41.363810](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__open-calm-7b/blob/main/results_2023-09-18T05-13-41.363810.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"em": 0.009123322147651007,
"em_stderr": 0.000973701770554162,
"f1": 0.039316275167785276,
"f1_stderr": 0.0014587233446804973,
"acc": 0.24383651483119942,
"acc_stderr": 0.00767932509907164
},
"harness|drop|3": {
"em": 0.009123322147651007,
"em_stderr": 0.000973701770554162,
"f1": 0.039316275167785276,
"f1_stderr": 0.0014587233446804973
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674368
},
"harness|winogrande|5": {
"acc": 0.48539857932123126,
"acc_stderr": 0.014046492383275842
}
}
```
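As a sketch of how these numbers can be consumed (using an abridged copy of the dict above rather than loading it from the repo), the per-task accuracy figures can be pulled out while skipping the "all" aggregate:

```python
# Abridged copy of the results JSON shown above.
results = {
    "all": {"acc": 0.24383651483119942, "acc_stderr": 0.00767932509907164},
    "harness|gsm8k|5": {"acc": 0.002274450341167551,
                        "acc_stderr": 0.0013121578148674368},
    "harness|winogrande|5": {"acc": 0.48539857932123126,
                             "acc_stderr": 0.014046492383275842},
}

# Collect accuracy (with its stderr) for every individual task that reports one.
per_task_acc = {
    task: (metrics["acc"], metrics["acc_stderr"])
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

for task, (acc, stderr) in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f} ± {stderr:.4f}")
```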
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-source-metrics/preprocessed_pip | ---
dataset_info:
features:
- name: accelerate
dtype: int64
- name: datasets
dtype: int64
- name: diffusers
dtype: int64
- name: evaluate
dtype: int64
- name: gradio
dtype: int64
- name: huggingface_hub
dtype: int64
- name: optimum
dtype: int64
- name: peft
dtype: int64
- name: pytorch_image_models
dtype: int64
- name: safetensors
dtype: int64
- name: tokenizers
dtype: int64
- name: transformers
dtype: int64
- name: sentence_transformers
dtype: int64
- name: langchain
dtype: int64
- name: pytorch
dtype: int64
- name: tensorflow
dtype: int64
- name: openai
dtype: int64
- name: day
dtype: string
splits:
- name: raw
num_bytes: 254550
num_examples: 1676
- name: wow
num_bytes: 36300
num_examples: 239
download_size: 143700
dataset_size: 290850
configs:
- config_name: default
data_files:
- split: raw
path: data/raw-*
- split: wow
path: data/wow-*
---
# Dataset Card for "preprocessed_pip"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-emotion-700553d6-10835457 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: lewiswatson/distilbert-base-uncased-finetuned-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: lewiswatson/distilbert-base-uncased-finetuned-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewiswatson](https://huggingface.co/lewiswatson) for evaluating this model. |
CyberHarem/kagari_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kagari/カガリ (Pokémon)
This is the dataset of kagari/カガリ (Pokémon), containing 482 images and their tags.
The core tags of this character are `purple_hair, short_hair, purple_eyes, horns, fake_horns, breasts, horned_headwear, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 482 | 484.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 482 | 300.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1047 | 589.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 482 | 441.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1047 | 800.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kagari_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, gloves, hoodie, looking_at_viewer, ribbed_sweater, solo, uniform, blush |
| 1 | 11 |  |  |  |  |  | 1girl, gloves, holding_poke_ball, hood_up, looking_at_viewer, solo, eyelashes, poke_ball_(basic), ribbed_dress, closed_mouth, red_dress, sweater_dress, hand_up, ribbed_sweater, white_background, boots, red_footwear |
| 2 | 16 |  |  |  |  |  | 1girl, gloves, hood_up, looking_at_viewer, eyelashes, solo, ribbed_sweater, closed_mouth, ribbed_dress, blush, upper_body, smile, sweater_dress |
| 3 | 5 |  |  |  |  |  | 1girl, boots, gloves, pokemon_(creature), hood_up, hoodie, red_footwear, ribbed_sweater, sweater_dress, closed_mouth, full_body, looking_at_viewer, ribbed_dress, smile, standing, eyelashes, red_dress |
| 4 | 20 |  |  |  |  |  | 1boy, 1girl, blush, hetero, ribbed_sweater, solo_focus, vaginal, hoodie, pussy, gloves, penis, clothed_sex, open_mouth, large_breasts, spread_legs, uniform, bar_censor, cowgirl_position, girl_on_top, lying, nipples |
| 5 | 5 |  |  |  |  |  | 1girl, blush, closed_mouth, large_breasts, long_sleeves, nipples, open_clothes, solo, hood_up, jacket, looking_at_viewer, ass_visible_through_thighs, cameltoe, simple_background, black_panties, black_thighhighs, breasts_out, cowboy_shot, gloves, heavy_breathing, hoodie, leotard, navel, piercing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | hoodie | looking_at_viewer | ribbed_sweater | solo | uniform | blush | holding_poke_ball | hood_up | eyelashes | poke_ball_(basic) | ribbed_dress | closed_mouth | red_dress | sweater_dress | hand_up | white_background | boots | red_footwear | upper_body | smile | pokemon_(creature) | full_body | standing | 1boy | hetero | solo_focus | vaginal | pussy | penis | clothed_sex | open_mouth | large_breasts | spread_legs | bar_censor | cowgirl_position | girl_on_top | lying | nipples | long_sleeves | open_clothes | jacket | ass_visible_through_thighs | cameltoe | simple_background | black_panties | black_thighhighs | breasts_out | cowboy_shot | heavy_breathing | leotard | navel | piercing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------|:--------------------|:-----------------|:-------|:----------|:--------|:--------------------|:----------|:------------|:--------------------|:---------------|:---------------|:------------|:----------------|:----------|:-------------------|:--------|:---------------|:-------------|:--------|:---------------------|:------------|:-----------|:-------|:---------|:-------------|:----------|:--------|:--------|:--------------|:-------------|:----------------|:--------------|:-------------|:-------------------|:--------------|:--------|:----------|:---------------|:---------------|:---------|:-----------------------------|:-----------|:--------------------|:----------------|:-------------------|:--------------|:--------------|:------------------|:----------|:--------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | X | | X | X | X | | X | | X | X | | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | | | | | X | X | | X | X | X | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 20 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | | X | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Archeane/tldr | ---
dataset_info:
features:
- name: Summary
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 20981509.697089948
num_examples: 6123
- name: test
num_bytes: 1168494.987213404
num_examples: 341
- name: valid
num_bytes: 1165068.315696649
num_examples: 340
download_size: 14342947
dataset_size: 23315073.0
---
# Dataset Card for "tldr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChumsAi/RocheTraindata1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9194
num_examples: 19
download_size: 8006
dataset_size: 9194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|