| datasetId | card |
|---|---|
tyzhu/squad_qa_wrong_title_v5_full_random_permute_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10150707.748609567
num_examples: 6305
- name: validation
num_bytes: 361864
num_examples: 300
download_size: 1521936
dataset_size: 10512571.748609567
---
# Dataset Card for "squad_qa_wrong_title_v5_full_random_permute_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/tl | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2915260
num_examples: 1030
download_size: 1697269
dataset_size: 2915260
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ch08931/Tiago | ---
license: openrail
---
|
alexshengzhili/llava-graph-caption2mentioned-vis | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': val
splits:
- name: train
num_bytes: 13973907732.925
num_examples: 394005
- name: validation
num_bytes: 391841344.372
num_examples: 10468
- name: test
num_bytes: 395140154.152
num_examples: 10336
download_size: 18302954238
dataset_size: 14760889231.449
---
|
d0rj/audiocaps-ru | ---
dataset_info:
features:
- name: audiocap_id
dtype: int64
- name: youtube_id
dtype: string
- name: start_time
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 6362503.0
num_examples: 49838
- name: validation
num_bytes: 306375.0
num_examples: 2475
- name: test
num_bytes: 714432.0
num_examples: 4875
download_size: 3704490
dataset_size: 7383310.0
license: mit
task_categories:
- text-to-speech
language:
- ru
multilinguality:
- monolingual
tags:
- youtube
- captions
pretty_name: AudioCaps (ru)
size_categories:
- 10K<n<100K
source_datasets:
- d0rj/audiocaps
language_creators:
- translated
---
# audiocaps-ru
Translated version of [d0rj/audiocaps](https://huggingface.co/datasets/d0rj/audiocaps) into Russian. |
smfreeze/mr-collin-hegarty-maths | ---
license: openrail
---
Sub to Tubular Pickaxe |
DatasetingBR/RafaLucas | ---
license: openrail
---
|
antahia/bgb_data | ---
configs:
- config_name: default
data_files:
- split: train
path: "train/*.txt"
- split: test
path: "test/*.txt"
- split: valid
path: "valid/*.txt"
--- |
micsell/hebrew_speech_campus_nikud | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 58404569136
num_examples: 60739
- name: test
num_bytes: 14602076832
num_examples: 15185
download_size: 19094723507
dataset_size: 73006645968
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
irds/istella22_test_fold5 | ---
pretty_name: '`istella22/test/fold5`'
viewer: false
source_datasets: ['irds/istella22']
task_categories:
- text-retrieval
---
# Dataset Card for `istella22/test/fold5`
The `istella22/test/fold5` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/istella22#istella22/test/fold5).
# Data
This dataset provides:
- `queries` (i.e., topics); count=439
- `qrels`: (relevance assessments); count=2,094
- For `docs`, use [`irds/istella22`](https://huggingface.co/datasets/irds/istella22)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/istella22_test_fold5', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/istella22_test_fold5', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
malaysia-ai/filtered-aya-dataset-zsm | ---
task_categories:
- question-answering
language:
- ms
---
# Filtered CohereForAI/aya_dataset on zsm language
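This card covers the rows filtered down to the `zsm` (Standard Malay) subset. A plain-Python sketch of the filtering idea (the real pipeline presumably used the `datasets` library; the `language_code` column name is an assumption based on CohereForAI/aya_dataset):

```python
# Toy stand-in for the source rows; the real dataset has more columns.
rows = [
    {'inputs': 'Apakah ibu negara Malaysia?', 'language_code': 'zsm'},
    {'inputs': 'What is the capital of France?', 'language_code': 'eng'},
]

# Keep only Standard Malay (`zsm`) rows.
zsm_only = [r for r in rows if r['language_code'] == 'zsm']
print(len(zsm_only))  # 1
```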
Originally from https://huggingface.co/datasets/CohereForAI/aya_dataset, filtered to rows in the `zsm` language only. |
WaeliFatima/translate_dataset_3type | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: tranlated_type_Sentence
num_bytes: 1322498
num_examples: 3000
- name: tranlated_type_Word
num_bytes: 1546617
num_examples: 3259
- name: tranlated_type_Span
num_bytes: 1427130
num_examples: 3000
download_size: 2123982
dataset_size: 4296245
configs:
- config_name: default
data_files:
- split: tranlated_type_Sentence
path: data/tranlated_type_Sentence-*
- split: tranlated_type_Word
path: data/tranlated_type_Word-*
- split: tranlated_type_Span
path: data/tranlated_type_Span-*
---
|
theblackcat102/crossvalidated-posts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Id
dtype: string
- name: PostTypeId
dtype: string
- name: AcceptedAnswerId
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
- name: ViewCount
dtype: string
- name: Body
dtype: string
- name: Title
dtype: string
- name: ContentLicense
dtype: string
- name: FavoriteCount
dtype: string
- name: CreationDate
dtype: string
- name: LastActivityDate
dtype: string
- name: LastEditDate
dtype: string
- name: LastEditorUserId
dtype: string
- name: OwnerUserId
dtype: string
- name: Tags
sequence: string
splits:
- name: train
num_bytes: 566804417
num_examples: 411232
download_size: 311064786
dataset_size: 566804417
language:
- code
- en
task_categories:
- question-answering
- text-generation
- text2text-generation
tags:
- code
---
# Cross Validated / stats.stackexchange.com
## Dataset Summary
This dataset contains all posts submitted to stats.stackexchange.com before 30 August 2023, formatted as **Markdown text**.<br>
The data is sourced from the [Internet Archive StackExchange Data Dump](https://archive.org/download/stackexchange) and follows the format of [mikex86/stackoverflow-posts](https://huggingface.co/datasets/mikex86/stackoverflow-posts)
## Dataset Structure
Each record corresponds to one post of a particular type.
Original ordering from the data dump is not exactly preserved due to parallelism in the script used to process the data dump.
The markdown content of each post is contained in the `Body` field. The license for a particular post is contained in the `ContentLicense` field.
### Data Fields
```typescript
{
Id: long,
PostTypeId: long, // 1=Question, 2=Answer, 3=Orphaned tag wiki, 4=Tag wiki excerpt, 5=Tag wiki, 6=Moderator nomination, 7=Wiki Placeholder, 8=Privilege Wiki
AcceptedAnswerId: long | null, // only present if PostTypeId=1
ParentId: long | null, // only present if PostTypeId=2
Score: long,
ViewCount: long | null,
Body: string | null,
Title: string | null,
ContentLicense: string | null,
FavoriteCount: long | null,
CreationDate: string | null,
LastActivityDate: string | null,
LastEditDate: string | null,
LastEditorUserId: long | null,
OwnerUserId: long | null,
Tags: array<string> | null
}
```
Also consider the [StackExchange Datadump Schema Documentation](https://meta.stackexchange.com/questions/2677/database-schema-documentation-for-the-public-data-dump-and-sede), as all fields
have analogs in the original dump format.
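The `PostTypeId` encoding above can be turned into readable labels with a small lookup (an illustrative helper, not part of the dataset):

```python
# Map StackExchange PostTypeId values to human-readable labels,
# following the encoding documented in the Data Fields section.
POST_TYPES = {
    1: 'Question',
    2: 'Answer',
    3: 'Orphaned tag wiki',
    4: 'Tag wiki excerpt',
    5: 'Tag wiki',
    6: 'Moderator nomination',
    7: 'Wiki Placeholder',
    8: 'Privilege Wiki',
}

def post_type(post_type_id) -> str:
    # This dataset stores ids as strings, so coerce defensively.
    return POST_TYPES.get(int(post_type_id), 'Unknown')

print(post_type('1'))  # Question
```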
## How to use?
```python
from datasets import load_dataset
# predownload full dataset
ds = load_dataset('theblackcat102/crossvalidated-posts', split='train')
# dataset streaming (will only download the data as needed)
ds = load_dataset('theblackcat102/crossvalidated-posts', split='train', streaming=True)
for sample in iter(ds): print(sample["Body"])
```
## How is the text stored?
The original Data Dump formats the "Body" field as HTML, using tags such as `<code>`, `<h1>`, `<ul>`, etc.
This HTML has been converted to Markdown following the [mikex86/stackoverflow-posts](https://huggingface.co/datasets/mikex86/stackoverflow-posts) conversion rules.
**Example:**
After differencing I saw that my constant/intercept is not statistically significant. Does anybody know how to fit the same model without the const term?
im using statsmodels.tsa.arima.model
To give a relative example I have: `ARIMA(data, order=(3,0,0))` an AR(3) model and say it that the second coefficient is insignificant. I can get rid of it by typing
```
ARMA(data,order=([1, 3], 0, 0)
```
but how can I get rid of coefficient??
|
irds/msmarco-passage_train_triples-small | ---
pretty_name: '`msmarco-passage/train/triples-small`'
viewer: false
source_datasets: ['irds/msmarco-passage']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-passage/train/triples-small`
The `msmarco-passage/train/triples-small` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-passage#msmarco-passage/train/triples-small).
# Data
This dataset provides:
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/msmarco-passage`](https://huggingface.co/datasets/irds/msmarco-passage)
## Usage
```python
from datasets import load_dataset
docpairs = load_dataset('irds/msmarco-passage_train_triples-small', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj and Daniel Campos and Nick Craswell and Li Deng and Jianfeng Gao and Xiaodong Liu and Rangan Majumder and Andrew McNamara and Bhaskar Mitra and Tri Nguyen and Mir Rosenberg and Xia Song and Alina Stoica and Saurabh Tiwary and Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
anlp/anno1_w_elimination | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: sentences
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 1239484
num_examples: 917
download_size: 249472
dataset_size: 1239484
---
# Dataset Card for "anno1_w_elimination"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlexWortega/InstructDiffusion | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: file_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1918299
num_examples: 4060
download_size: 833258
dataset_size: 1918299
---
|
VXDAW/adad | ---
license: unknown
---
|
GunA-SD/DataX | ---
license: apache-2.0
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Topic
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 5397321128
num_examples: 1720117
download_size: 3148810475
dataset_size: 5397321128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
- summarization
- question-answering
language:
- en
size_categories:
- 1M<n<10M
---
<center>
<p>
<img src="./image.jpg" width="40%" height="40%">
</p>
</center>
## Description
The "DataX" dataset is a curated collection combining data generated by large language models (LLMs) and information scraped from Wikipedia.
It spans a vast array of topics, providing a rich resource for tasks such as text generation, text-to-text generation, summarization, and conversational models.
With over 1.7 million examples, it stands as a significant asset for training robust and diverse machine learning and deep learning models.
## Completeness and Future Work
While the dataset currently offers a substantial volume of data, efforts are ongoing to expand its scope and utility.
Future updates may include additional splits for validation and testing, broader topic coverage, and enhanced metadata for even richer model training possibilities.
### Intended Use
The "DataX" dataset is intended for use in academic research and practical applications within the fields of natural language processing (NLP) and machine learning (ML).
It is particularly suited for training and evaluating models on a wide range of tasks. Researchers and developers are encouraged to utilize this dataset to explore innovative
NLP techniques and to benchmark the performance of models in a variety of contexts.
### Limitations
This dataset, while extensive, represents a snapshot of information available up to the year 2023. Users should be aware of the dataset's temporal context when applying it to contemporary models and research.
Furthermore, the dataset's language coverage is currently limited to English, which may restrict its applicability for multilingual or non-English projects.
### Ethical Considerations
The compilation of this dataset involved collecting data generated by LLMs and scraping content from Wikipedia. While every effort has been made to ensure the dataset adheres to ethical guidelines and respects copyright laws,
users are advised to consider the potential for bias and the representation of diverse perspectives within the data. Additionally, users should evaluate the dataset's appropriateness for their specific research or application needs,
particularly in sensitive or regulated domains.
## Usage
You can use this dataset by loading it using the Hugging Face datasets library or any other relevant method.
#### Example Usage
```python
from datasets import load_dataset
# Load the dataset
data = load_dataset('GunA-SD/DataX')
```
## Citation:
Please cite this dataset in your publications if it helps your research:
```
@misc{DataX,
title = {DataX: A Mixture of LLM Generated and Wiki Scraped Data},
author = {Gunasekar},
year = {2023},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/GunA-SD/DataX}
}
```
## License
This dataset is distributed under the Apache-2.0 License. Full license text is available at [LICENSE](https://apache.org/licenses/LICENSE-2.0).
|
tiagoblima/punctuation-mec-bert | ---
dataset_info:
features:
- name: tag
dtype: string
- name: sent_id
dtype: int64
- name: text_id
dtype: int64
- name: sent_text
dtype: string
- name: tokens
sequence: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 1075373
num_examples: 2168
download_size: 313037
dataset_size: 1075373
---
# Dataset Card for "mec-punctuation-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minathor/132 | ---
license: openrail
---
|
Domingos33/Filmes | ---
license: openrail
---
|
CyberHarem/i_400_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of i_400 (Kantai Collection)
This is the dataset of i_400 (Kantai Collection), containing 130 images and their tags.
The core tags of this character are `long_hair, black_hair, headgear, bangs, black_eyes, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 130 | 102.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_400_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 130 | 66.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_400_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 279 | 137.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_400_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 130 | 94.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_400_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 279 | 181.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_400_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/i_400_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, orange_sailor_collar, sailor_shirt, school_swimsuit, sleeveless_shirt, solo, swimsuit_under_clothes, white_shirt, bare_arms, looking_at_viewer, open_mouth, smile, white_background, black_one-piece_swimsuit, simple_background, cowboy_shot, standing, tanlines, teeth |
| 1 | 6 |  |  |  |  |  | 1girl, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, solo, strapless_leotard, wrist_cuffs, black_leotard, looking_at_viewer, one-piece_tan, open_mouth, alternate_costume, blush, cowboy_shot, rabbit_tail, simple_background, smile, white_background, bare_legs, blue_leotard, bowtie, dated, small_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_one-piece_swimsuit | orange_sailor_collar | sailor_shirt | school_swimsuit | sleeveless_shirt | solo | swimsuit_under_clothes | white_shirt | bare_arms | looking_at_viewer | open_mouth | smile | white_background | black_one-piece_swimsuit | simple_background | cowboy_shot | standing | tanlines | teeth | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | black_leotard | one-piece_tan | alternate_costume | blush | rabbit_tail | bare_legs | blue_leotard | bowtie | dated | small_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------------|:-----------------------|:---------------|:------------------|:-------------------|:-------|:-------------------------|:--------------|:------------|:--------------------|:-------------|:--------|:-------------------|:---------------------------|:--------------------|:--------------|:-----------|:-----------|:--------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:----------------|:----------------|:--------------------|:--------|:--------------|:------------|:---------------|:---------|:--------|:----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | | | X | | | | X | X | X | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
yixuantt/MultiHopRAG | ---
license: odc-by
task_categories:
- question-answering
- feature-extraction
language:
- en
pretty_name: MultiHop-RAG
size_categories:
- 1K<n<10K
configs:
- config_name: MultiHopRAG
data_files: "MultiHopRAG.json"
- config_name: corpus
data_files: "corpus.json"
---
# Dataset Card for MultiHop-RAG
A Dataset for Evaluating Retrieval-Augmented Generation Across Documents
### Dataset Description
**MultiHop-RAG**: a QA dataset for evaluating retrieval and reasoning across documents with metadata in RAG pipelines. It contains 2556 queries, with evidence for each query distributed across 2 to 4 documents. The queries also involve document metadata, reflecting complex scenarios commonly found in real-world RAG applications.
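As a rough illustration of what "multi-hop" means here (the field names below are invented for illustration, not the dataset's documented schema), one record pairs a single query with evidence spread over several source documents:

```python
# Illustrative shape of a multi-hop query record: answering it requires
# combining facts retrieved from multiple documents.
record = {
    'query': 'Which two outlets both covered the same product launch?',
    'answer': 'Outlet A and Outlet B',
    'evidence_list': [
        {'source': 'Outlet A', 'fact': 'Outlet A covered the launch in June.'},
        {'source': 'Outlet B', 'fact': 'Outlet B covered the launch in July.'},
    ],
}

# "Multi-hop" means 2 to 4 such evidence pieces must be retrieved and combined.
print(len(record['evidence_list']))  # 2
```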
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Github:** [MultiHop-RAG](https://github.com/yixuantt/MultiHop-RAG)
- **Paper:** [MultiHop-RAG: Benchmarking Retrieval-Augmented Generation for Multi-Hop Queries](https://arxiv.org/abs/2401.15391)
## Citation
**BibTeX:**
```
@misc{tang2024multihoprag,
title={MultiHop-RAG: Benchmarking Retrieval-Augmented Generation for Multi-Hop Queries},
author={Yixuan Tang and Yi Yang},
year={2024},
eprint={2401.15391},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
brusic/hacker-news-who-is-hiring-posts | ---
license: mit
---
# Context
This dataset contains all first-level comments to [Hacker News Who Is Hiring posts](https://news.ycombinator.com/submitted?id=whoishiring) from April 2011 to March 2024 in a pickle format. All data is derived from the [official Firebase API](https://github.com/HackerNews/API) and no data cleansing has occurred.
*Who wants to be hired?* and *Freelancer* posts are not included.
# Content
Each row contains the content for a single month which includes:
- **month**: the month in `mmmm yyyy` format
- **parent_id**: the submission id
- **comments**: list of comments for the given month.
Each month is stored as a single row so that new data can be easily appended. [Threads are closed to new comments after two weeks](https://news.ycombinator.com/newsfaq.html), so a new row can be prepended after the middle of the current month.
| | **month** | **parent_id** | **comments** |
| ------- | -------------- | ------------- | ------------------------------------------------- |
| **0** | March 2024 | 39562986 | [{'id': 39563104, 'by': 'jnathsf', 'text': 'Ci... |
| **1** | February 2024 | 39217310 | [{'id': 39375047, 'by': 'lpnoel1', 'text': 'Di... |
| **2** | January 2024 | 38842977 | [{'id': 38844766, 'by': 'pudo', 'text': 'OpenS... |
| **...** | ... | ... | ... |
| **159** | June 2011 | 2607052 | [{'id': 2607280, 'by': 'yummyfajitas', 'text':... |
| **160** | May 2011 | 2503204 | [{'id': 2504067, 'by': 'jfarmer', 'text': 'Eve... |
| **161** | April 2011 | 2396027 | [{'id': 2396144, 'by': 'pchristensen', 'text':... |
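The monthly update described above can be sketched with pandas (all values below are placeholders; the real frame comes from the pickle file):

```python
import pandas as pd

# Existing frame with one row per month (toy data mirroring the table above).
hiring_df = pd.DataFrame({
    'month': ['March 2024'],
    'parent_id': [39562986],
    'comments': [[{'id': 39563104, 'by': 'jnathsf', 'text': 'Ci...'}]],
})

# Hypothetical next month's thread, prepended as a single new row.
new_row = pd.DataFrame({
    'month': ['April 2024'],
    'parent_id': [99999999],  # placeholder submission id
    'comments': [[{'id': 1, 'by': 'someone', 'text': 'Hiring...'}]],
})
updated_df = pd.concat([new_row, hiring_df], ignore_index=True)
print(updated_df['month'].tolist())  # ['April 2024', 'March 2024']
```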
### The data frame can easily be converted into a row-per-comment format
```python
import pandas as pd
hiring_df = pd.read_pickle('hiring_march_2024.pck')
exploded_df = hiring_df.explode('comments').dropna().reset_index(drop=True).rename(columns={'comments': 'comment'})
comments_df = exploded_df.join(pd.DataFrame(exploded_df['comment'].tolist())).drop('comment', axis=1)
```
| | **month** | **parent_id** | **id** | **by** | **text** |
| ----- | ---------- | ------------- | -------- | --------------- | ------------------------------------------------- |
| **0** | March 2024 | 39562986 | 39563104 | jnathsf | City Innovate |
|
csac/dw | ---
license: other
license_name: wda
license_link: LICENSE
---
|
crich/syndicom | ---
license: wtfpl
task_categories:
- conversational
- text-generation
- text-classification
language:
- en
size_categories:
- 10K<n<100K
--- |
dyllanwli/dataproduct_metadata_tqa | ---
license: apache-2.0
---
|
wheart/aiazuki | ---
license: openrail
---
stable diffusion
Azuki
|
open-llm-leaderboard/details_elinas__chronos007-70b | ---
pretty_name: Evaluation run of elinas/chronos007-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos007-70b](https://huggingface.co/elinas/chronos007-70b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos007-70b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T16:36:09.949809](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos007-70b_public/blob/main/results_2023-11-08T16-36-09.949809.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08756291946308725,\n\
\ \"em_stderr\": 0.002894684468980241,\n \"f1\": 0.1552086828859053,\n\
\ \"f1_stderr\": 0.0030733731115224513,\n \"acc\": 0.6242477589094606,\n\
\ \"acc_stderr\": 0.012180910628722973\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08756291946308725,\n \"em_stderr\": 0.002894684468980241,\n\
\ \"f1\": 0.1552086828859053,\n \"f1_stderr\": 0.0030733731115224513\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42608036391205456,\n \
\ \"acc_stderr\": 0.013621144396086709\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359238\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos007-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T16_36_09.949809
path:
- '**/details_harness|drop|3_2023-11-08T16-36-09.949809.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T16-36-09.949809.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T16_36_09.949809
path:
- '**/details_harness|gsm8k|5_2023-11-08T16-36-09.949809.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T16-36-09.949809.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T16_36_09.949809
path:
- '**/details_harness|winogrande|5_2023-11-08T16-36-09.949809.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T16-36-09.949809.parquet'
- config_name: results
data_files:
- split: 2023_11_08T16_36_09.949809
path:
- results_2023-11-08T16-36-09.949809.parquet
- split: latest
path:
- results_2023-11-08T16-36-09.949809.parquet
---
# Dataset Card for Evaluation run of elinas/chronos007-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elinas/chronos007-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elinas/chronos007-70b](https://huggingface.co/elinas/chronos007-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos007-70b_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T16:36:09.949809](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos007-70b_public/blob/main/results_2023-11-08T16-36-09.949809.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08756291946308725,
"em_stderr": 0.002894684468980241,
"f1": 0.1552086828859053,
"f1_stderr": 0.0030733731115224513,
"acc": 0.6242477589094606,
"acc_stderr": 0.012180910628722973
},
"harness|drop|3": {
"em": 0.08756291946308725,
"em_stderr": 0.002894684468980241,
"f1": 0.1552086828859053,
"f1_stderr": 0.0030733731115224513
},
"harness|gsm8k|5": {
"acc": 0.42608036391205456,
"acc_stderr": 0.013621144396086709
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359238
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jilp00/icdcm-code-desc | ---
dataset_info:
features:
- name: code
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 18063160
num_examples: 226015
download_size: 5168248
dataset_size: 18063160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mteb/stackoverflowdupquestions-reranking | ---
language:
- en
--- |
mteb/sickr-sts | ---
language:
- en
--- |
yasminemun/testing_instruction_dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
sequence: string
splits:
- name: train
num_bytes: 54111
num_examples: 10
download_size: 57447
dataset_size: 54111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tritika30/dataset | ---
license: mit
---
|
youngermax/sherlock | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: article
dtype: string
- name: infobox
dtype: string
splits:
- name: train
num_bytes: 373301449
num_examples: 27906
download_size: 216489948
dataset_size: 373301449
---
# Dataset Card for "sherlock"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yae_miko_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yae_miko/八重神子/八重神子 (Genshin Impact)
This is the dataset of yae_miko/八重神子/八重神子 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `pink_hair, long_hair, animal_ears, fox_ears, purple_eyes, hair_between_eyes, breasts, earrings, hair_ornament, large_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.35 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_miko_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_miko_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1370 | 2.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_miko_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yae_miko_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, japanese_clothes, jewelry, long_sleeves, looking_at_viewer, smile, solo, white_shirt, wide_sleeves, nontraditional_miko, sideboob, parted_lips, cherry_blossoms, cowboy_shot, sleeveless_shirt, hand_up, red_skirt, petals |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, japanese_clothes, jewelry, looking_at_viewer, nontraditional_miko, parted_lips, smile, solo, upper_body, wide_sleeves, hand_up, long_sleeves, petals, pink_nails |
| 2 | 8 |  |  |  |  |  | 1girl, bare_legs, bare_shoulders, cherry_blossoms, detached_sleeves, floppy_ears, japanese_clothes, jewelry, looking_at_viewer, nontraditional_miko, sideboob, smile, solo, thighs, white_shirt, wide_sleeves, sitting, sleeveless_shirt, feet_out_of_frame, long_sleeves, outdoors, blush, crossed_legs, hand_up, tree, falling_petals, parted_lips, red_skirt, closed_mouth, fox_shadow_puppet, torii |
| 3 | 7 |  |  |  |  |  | 1girl, bare_legs, bare_shoulders, closed_mouth, detached_sleeves, japanese_clothes, jewelry, long_sleeves, looking_at_viewer, nontraditional_miko, okobo, sandals, smile, solo, thighs, toes, white_shirt, wide_sleeves, full_body, red_skirt, medium_breasts, sideboob, sleeveless_shirt, cherry_blossoms, feet, floral_print, low-tied_long_hair |
| 4 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, blue_sky, blush, cleavage, collarbone, navel, smile, day, jewelry, outdoors, stomach, thighs, white_bikini, beach, closed_mouth, cloud, tongue_out, water, wet |
| 5 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, jewelry, solo, thighs, casual_one-piece_swimsuit, collarbone, covered_navel, floppy_ears, looking_at_viewer, smile, alternate_costume, closed_mouth, cowboy_shot, highleg, standing, wet, white_one-piece_swimsuit |
| 6 | 8 |  |  |  |  |  | 1girl, cleavage, floppy_ears, looking_at_viewer, navel, solo, stomach, thighs, bare_shoulders, blush, long_sleeves, alternate_costume, collarbone, smile, off_shoulder, white_shirt, midriff, black_shorts, closed_mouth, cowboy_shot, crop_top, hand_up, jewelry, open_shirt, short_shorts, sidelocks, simple_background, standing, white_background, white_bra, white_panties |
| 7 | 5 |  |  |  |  |  | 1girl, alternate_costume, black_skirt, looking_at_viewer, solo, floppy_ears, smile, blush, cleavage, collarbone, collared_shirt, contemporary, cowboy_shot, jewelry, office_lady, pencil_skirt, shirt_tucked_in, bare_shoulders, black_pantyhose, closed_mouth, dress_shirt, holding, indoors, long_sleeves, low-tied_long_hair, parted_lips, sidelocks, sitting, thighs, white_shirt, window |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | detached_sleeves | japanese_clothes | jewelry | long_sleeves | looking_at_viewer | smile | solo | white_shirt | wide_sleeves | nontraditional_miko | sideboob | parted_lips | cherry_blossoms | cowboy_shot | sleeveless_shirt | hand_up | red_skirt | petals | upper_body | pink_nails | bare_legs | floppy_ears | thighs | sitting | feet_out_of_frame | outdoors | blush | crossed_legs | tree | falling_petals | closed_mouth | fox_shadow_puppet | torii | okobo | sandals | toes | full_body | medium_breasts | feet | floral_print | low-tied_long_hair | blue_sky | cleavage | collarbone | navel | day | stomach | white_bikini | beach | cloud | tongue_out | water | wet | casual_one-piece_swimsuit | covered_navel | alternate_costume | highleg | standing | white_one-piece_swimsuit | off_shoulder | midriff | black_shorts | crop_top | open_shirt | short_shorts | sidelocks | simple_background | white_background | white_bra | white_panties | black_skirt | collared_shirt | contemporary | office_lady | pencil_skirt | shirt_tucked_in | black_pantyhose | dress_shirt | holding | indoors | window |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------------|:-------------------|:----------|:---------------|:--------------------|:--------|:-------|:--------------|:---------------|:----------------------|:-----------|:--------------|:------------------|:--------------|:-------------------|:----------|:------------|:---------|:-------------|:-------------|:------------|:--------------|:---------|:----------|:--------------------|:-----------|:--------|:---------------|:-------|:-----------------|:---------------|:--------------------|:--------|:--------|:----------|:-------|:------------|:-----------------|:-------|:---------------|:---------------------|:-----------|:-----------|:-------------|:--------|:------|:----------|:---------------|:--------|:--------|:-------------|:--------|:------|:----------------------------|:----------------|:--------------------|:----------|:-----------|:---------------------------|:---------------|:----------|:---------------|:-----------|:-------------|:---------------|:------------|:--------------------|:-------------------|:------------|:----------------|:--------------|:-----------------|:---------------|:--------------|:---------------|:------------------|:------------------|:--------------|:----------|:----------|:---------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | | X | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | X | | X | | | | X | | X | | | | | | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | | X | | X | X | X | | | | | | | | | | | | | | | | X | | | X | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | | X | | X | X | X | | | | | | | X | | | | | | | | X | X | | | | | | | | X | | | | | | | | | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | | | X | X | X | X | X | X | | | | | | X | | X | | | | | | X | X | | | | X | | | | X | | | | | | | | | | | | X | X | X | | X | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | X | X | X | X | X | X | | | | X | | X | | | | | | | | X | X | X | | | X | | | | X | | | | | | | | | | X | | X | X | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
bnvsyjy/test_llama | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ygorgeurts/movie-quotes | ---
license: apache-2.0
---
|
Siraitia/deeplearning-catdog | ---
license: unknown
---
|
AiresPucrs/tweets | ---
language:
- en
license: cc
size_categories:
- 10K<n<100K
task_categories:
- text-classification
pretty_name: Tweets
tags:
- toxicity
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1621836
num_examples: 14640
download_size: 894257
dataset_size: 1621836
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Tweets
## Overview
This dataset contains texts from customers posted on Twitter regarding their air travel experiences,
whether they were upset, neutral, or satisfied with the trip and the airline's service.
## Dataset Details
The dataset is a smaller version of the original dataset. The data originally came from [Crowdflower's Data for Everyone library](https://data.world/crowdflower).
The original Twitter data was scraped in February 2015; contributors were asked first to classify tweets as positive, negative, or neutral,
and then to categorize the negative reasons (such as "late flight" or "rude service").
This version records whether each tweet's sentiment was positive (16%), neutral (21%), or negative (63%), covering six US airlines.
- Dataset Name: [Twitter US Airline Sentiment](https://www.kaggle.com/datasets/crowdflower/twitter-airline-sentiment)
- Language: English
- Total Size: 14,640 examples
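The class shares quoted above can be re-checked after loading with a small helper (a hypothetical helper, not part of the dataset; note the card does not document which integer label maps to which sentiment):

```python
from collections import Counter

def label_distribution(labels):
    # Fraction of examples per label value; the fractions sum to 1.
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}
```

Applied to the full `label` column (e.g. `label_distribution(dataset['label'])`), this should roughly reproduce the 16%/21%/63% split.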
## Contents
The dataset consists of a data frame with the following columns:
- label
- text
```bash
{
"label": 0,
"text": "virginamerica why are your first fares in may over three times more than other carriers when all seats are available to select.",
}
```
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/tweets", split='train')
```
## License
The Twitter US Airline Sentiment dataset is licensed under the [Creative Commons (CC)](https://creativecommons.org/licenses/by-nc-sa/4.0/) license CC BY-NC-SA 4.0.
|
Ngadou/Spam_SMS | ---
license: cc
---
## Description
The Spam SMS dataset is a set of tagged SMS messages collected for SMS spam research. It contains 5,574 English SMS messages, each tagged as ham (legitimate) or spam.
Source: [uciml/sms-spam-collection-dataset](https://www.kaggle.com/datasets/uciml/sms-spam-collection-dataset) |
angeliuk/AlpacaCleanedWithDist | ---
license: apache-2.0
---
|
nhantruongcse/data_craw_130k_filter | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
splits:
- name: train
num_bytes: 486683107
num_examples: 129892
download_size: 249010164
dataset_size: 486683107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
harshit03/supervisedDataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 375302
num_examples: 681
download_size: 194087
dataset_size: 375302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Max5ive/tendergpt-training-dataset | ---
license: apache-2.0
---
|
irds/wikiclir_ca | ---
pretty_name: '`wikiclir/ca`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/ca`
The `wikiclir/ca` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/ca).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=548,722
- `queries` (i.e., topics); count=339,586
- `qrels`: (relevance assessments); count=965,233
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_ca', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_ca', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_ca', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
CyberHarem/irene_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of irene/アイリーニ/艾丽妮 (Arknights)
This is the dataset of irene/アイリーニ/艾丽妮 (Arknights), containing 500 images and their tags.
The core tags of this character are `grey_hair, wings, head_wings, long_hair, grey_eyes, scar_across_eye, earrings, scar_on_face, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/irene_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 470.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irene_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1329 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/irene_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 877.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irene_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1329 | 1.63 GiB | [Download](https://huggingface.co/datasets/CyberHarem/irene_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/irene_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_gloves, black_jacket, holding_lantern, jewelry, long_sleeves, scar, solo, closed_mouth, cowboy_shot, white_capelet, white_skirt, ammunition_belt, looking_at_viewer, sword, dress, simple_background, black_background |
| 1 | 9 |  |  |  |  |  | 1girl, black_gloves, black_jacket, holding_sword, rapier, scar, solo, white_skirt, holding_lantern, jewelry, looking_at_viewer, puffy_long_sleeves, white_pantyhose, ammunition_belt, closed_mouth, white_capelet, gun, cowboy_shot, dress, sheath |
| 2 | 5 |  |  |  |  |  | 1girl, ammunition_belt, black_footwear, black_gloves, black_jacket, closed_mouth, full_body, holding_lantern, scar, solo, standing, sword, white_capelet, white_pantyhose, white_skirt, ankle_boots, jewelry, looking_at_viewer, puffy_long_sleeves, gun, purple_skirt, dress, sheathed, shoes |
| 3 | 7 |  |  |  |  |  | 1girl, black_gloves, black_jacket, gun, holding_lantern, jewelry, looking_at_viewer, outdoors, purple_skirt, scar, solo, white_skirt, cloudy_sky, rapier, white_capelet, white_pantyhose, ammunition_belt, rain, standing, closed_mouth, cowboy_shot, holding_sword, sheathed, ocean, parted_lips, puffy_long_sleeves |
| 4 | 8 |  |  |  |  |  | 1girl, black_jacket, holding_sword, long_sleeves, looking_at_viewer, rapier, scar, solo, white_skirt, black_gloves, closed_mouth, cowboy_shot, jewelry, simple_background, white_background, white_pantyhose, handgun, holding_gun, ammunition_belt, dress |
| 5 | 8 |  |  |  |  |  | 1girl, black_dress, black_footwear, closed_mouth, juliet_sleeves, maid_headdress, official_alternate_costume, shoes, solo, white_apron, white_pantyhose, black_gloves, full_body, scar, jewelry, looking_at_viewer, maid_apron, frills, standing, holding, ponytail |
| 6 | 7 |  |  |  |  |  | 1girl, black_dress, juliet_sleeves, looking_at_viewer, official_alternate_costume, simple_background, solo, white_apron, white_background, black_gloves, maid_apron, maid_headdress, ponytail, scar, closed_mouth, necklace, sleeve_cuffs |
| 7 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, navel, scar, solo, blush, closed_mouth, collarbone, completely_nude, jewelry, medium_breasts, pussy, nipples, simple_background, stomach, white_background, barefoot |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_jacket | holding_lantern | jewelry | long_sleeves | scar | solo | closed_mouth | cowboy_shot | white_capelet | white_skirt | ammunition_belt | looking_at_viewer | sword | dress | simple_background | black_background | holding_sword | rapier | puffy_long_sleeves | white_pantyhose | gun | sheath | black_footwear | full_body | standing | ankle_boots | purple_skirt | sheathed | shoes | outdoors | cloudy_sky | rain | ocean | parted_lips | white_background | handgun | holding_gun | black_dress | juliet_sleeves | maid_headdress | official_alternate_costume | white_apron | maid_apron | frills | holding | ponytail | necklace | sleeve_cuffs | navel | blush | collarbone | completely_nude | medium_breasts | pussy | nipples | stomach | barefoot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:------------------|:----------|:---------------|:-------|:-------|:---------------|:--------------|:----------------|:--------------|:------------------|:--------------------|:--------|:--------|:--------------------|:-------------------|:----------------|:---------|:---------------------|:------------------|:------|:---------|:-----------------|:------------|:-----------|:--------------|:---------------|:-----------|:--------|:-----------|:-------------|:-------|:--------|:--------------|:-------------------|:----------|:--------------|:--------------|:-----------------|:-----------------|:-----------------------------|:--------------|:-------------|:---------|:----------|:-----------|:-----------|:---------------|:--------|:--------|:-------------|:------------------|:-----------------|:--------|:----------|:----------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | X | X | X | X | X | X | | | | | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | X | X | X | X | X | | | | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | | X | X | X | | X | X | | X | X | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | | X | | X | X | X | | | | | X | | | | | | | | X | | | X | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | | | | | X | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | | | X | X | X | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | X | | X | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
uchuukaizoku/jardines | ---
dataset_info:
features:
- name: file_name
dtype: image
- name: conditioning_file_name
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 100391798.0
num_examples: 163
download_size: 99594135
dataset_size: 100391798.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/VALUE_wikitext2_drop_aux | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 287459
num_examples: 386
- name: train
num_bytes: 2899414
num_examples: 3888
- name: validation
num_bytes: 235138
num_examples: 340
download_size: 2054815
dataset_size: 3422011
---
# Dataset Card for "VALUE_wikitext2_drop_aux"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyentruong-ins/codeforces_cpp_cleaned | ---
dataset_info:
features:
- name: solution
dtype: string
- name: difficulty
dtype: int64
splits:
- name: train
num_bytes: 1402541089.9400597
num_examples: 1076270
- name: test
num_bytes: 175317962.02997017
num_examples: 134534
- name: valid
num_bytes: 175317962.02997017
num_examples: 134534
download_size: 736785823
dataset_size: 1753177014.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
AdapterOcean/chemistry_dataset_standardized_cluster_1_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6068325
num_examples: 2750
download_size: 2563564
dataset_size: 6068325
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_standardized_cluster_1_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
windwp/my-image | ---
license: agpl-3.0
---
|
harpreetsahota/zero_shot_comparison | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
- name: decilm_generation
dtype: string
- name: mistral_generation
dtype: string
splits:
- name: train
num_bytes: 67718
num_examples: 30
download_size: 54407
dataset_size: 67718
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "zero_shot_comparison"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pmpc/processed-old-with-embeddings | ---
dataset_info:
- config_name: default
features:
- name: slug
dtype: string
- name: text_chunk
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 17448677826
num_examples: 3655376
download_size: 14805980593
dataset_size: 17448677826
- config_name: small
features:
- name: slug
dtype: string
- name: text_chunk
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 475656222.6698008
num_examples: 99531
- name: test
num_bytes: 23459991.330199156
num_examples: 4909
download_size: 488406448
dataset_size: 499116214.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: small
data_files:
- split: train
path: small/train-*
- split: test
path: small/test-*
---
# Dataset Card for "processed-old-with-embeddings"
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Chunks of about 256 whitespace-separated words and their embeddings, computed with the pretrained spaCy model [de_dep_news_trf](https://github.com/explosion/spacy-models/releases/tag/de_dep_news_trf-3.6.1).
The splits respect sentence boundaries parsed with the same model; sentences are concatenated as long as the result does not exceed max_words = 256, so the chunk length varies.
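The concatenation rule described above can be sketched as a greedy loop (a minimal sketch under assumptions; the actual pipeline parses sentences with spaCy and may differ in details):

```python
def chunk_sentences(sentences, max_words=256):
    # Greedily pack parsed sentences into chunks, flushing the current
    # chunk whenever adding the next sentence would exceed max_words
    # (counted as whitespace-separated tokens), so chunk length varies.
    chunks, current, current_len = [], [], 0
    for sent in sentences:
        n = len(sent.split())
        if current and current_len + n > max_words:
            chunks.append(" ".join(current))
            current, current_len = [], 0
        current.append(sent)
        current_len += n
    if current:
        chunks.append(" ".join(current))
    return chunks
```

Note that a single sentence longer than max_words still becomes its own chunk, consistent with the varying chunk lengths noted above.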
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
This dataset contains German-language texts from the legal domain (German court decisions).
## Dataset Structure
[More Information Needed]
### Data Instances
{'slug': 'ag-pinneberg-2003-12-19-68-ii-9302-weg',
'text_chunk': 'Die Berufung des Klägers gegen das am 23. April 2002 verkündete Urteil der 1. Zivilkammer des Landgerichts Wuppertal wird zurückgewiesen.\n\n Der Kläger trägt (...)',
 'embedding': [-0.055155396461486816, -0.3904547095298767, -0.0033536632545292377, 0.8048776984214783, 0.30156993865966797, 0.5924882888793945, (...)]}
### Data Fields
{
'slug': data['slug'],
'text_chunk': text,
'embedding': embedding
}
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
This dataset contains German-language texts from the legal domain (German court decisions).
### Citation Information
```
@inproceedings{10.1145/3383583.3398616,
  author = {Ostendorff, Malte and Blume, Till and Ostendorff, Saskia},
  title = {Towards an Open Platform for Legal Information},
  year = {2020},
  isbn = {9781450375856},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3383583.3398616},
  doi = {10.1145/3383583.3398616},
  booktitle = {Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2020},
  pages = {385–388},
  numpages = {4},
  keywords = {open data, open source, legal information system, legal data},
  location = {Virtual Event, China},
  series = {JCDL '20}
}
```
|
liuyanchen1015/MULTI_VALUE_qqp_your_you | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 380050
num_examples: 2522
- name: test
num_bytes: 3317067
num_examples: 21436
- name: train
num_bytes: 3373529
num_examples: 22352
download_size: 3991747
dataset_size: 7070646
---
# Dataset Card for "MULTI_VALUE_qqp_your_you"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HarshilPatel1905/train_emotion_spring_2024 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
sequence: float64
splits:
- name: train
num_bytes: 1186430.397980321
num_examples: 6179
- name: valid
num_bytes: 296655.6020196789
num_examples: 1545
download_size: 616357
dataset_size: 1483086.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
|
joshtobin/malicious_urls | ---
dataset_info:
features:
- name: url_len
dtype: int64
- name: abnormal_url
dtype: int64
- name: https
dtype: int64
- name: digits
dtype: int64
- name: letters
dtype: int64
- name: shortening_service
dtype: int64
- name: ip_address
dtype: int64
- name: '@'
dtype: int64
- name: '?'
dtype: int64
- name: '-'
dtype: int64
- name: '='
dtype: int64
- name: .
dtype: int64
- name: '#'
dtype: int64
- name: '%'
dtype: int64
- name: +
dtype: int64
- name: $
dtype: int64
- name: '!'
dtype: int64
- name: '*'
dtype: int64
- name: ','
dtype: int64
- name: //
dtype: int64
splits:
- name: train
num_bytes: 32000
num_examples: 200
download_size: 9837
dataset_size: 32000
---
# Dataset Card for "malicious_urls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eren23/cs_item_embeddings_small | ---
dataset_info:
features:
- name: embeddings
sequence: float64
- name: labels
dtype: string
- name: weapon_type
dtype: string
splits:
- name: train
num_bytes: 12161685
num_examples: 2926
download_size: 3131592
dataset_size: 12161685
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RJuro/neuro_patents_sample_finetune | ---
dataset_info:
features:
- name: appln_id
dtype: int64
- name: appln_filing_date
dtype: string
- name: docdb_family_id
dtype: int64
- name: granted
dtype: string
- name: appln_abstract
dtype: string
- name: appln_abstract_lg
dtype: string
- name: appln_title
dtype: string
- name: applt_coun
dtype: string
- name: invt_coun
dtype: string
- name: cpc
dtype: string
- name: ipc
sequence: string
- name: __index_level_0__
dtype: int64
- name: input
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 254841.9
num_examples: 107
download_size: 155075
dataset_size: 254841.9
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PaulineSanchez/recipes_translation_2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 57430.4
num_examples: 200
- name: validation
num_bytes: 14357.6
num_examples: 50
download_size: 48205
dataset_size: 71788.0
---
# Dataset Card for "recipes_translation_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Squirrl/autotrain-data-petscan | ---
task_categories:
- image-classification
---
# Dataset for project: Pet-Ray
## Dataset Description
This dataset has been processed by AutoTrain for project Pet-Ray.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<1800x4000 RGB PIL image>",
"target": 0
},
{
"image": "<1800x4000 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['chubs'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 11 |
| valid | 3 |
|
HiTZ/This-is-not-a-dataset | ---
license: apache-2.0
dataset_info:
features:
- name: pattern_id
dtype: int64
- name: pattern
dtype: string
- name: test_id
dtype: int64
- name: negation_type
dtype: string
- name: semantic_type
dtype: string
- name: syntactic_scope
dtype: string
- name: isDistractor
dtype: bool
- name: label
dtype: bool
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 41264658
num_examples: 268505
- name: validation
num_bytes: 3056321
num_examples: 22514
- name: test
num_bytes: 12684749
num_examples: 90281
download_size: 6311034
dataset_size: 57005728
task_categories:
- text-classification
language:
- en
tags:
- commonsense
- negation
- LLMs
- LLM
pretty_name: This is NOT a Dataset
size_categories:
- 100K<n<1M
multilinguality:
- monolingual
source_datasets:
- original
paperswithcode_id: this-is-not-a-dataset
---
<p align="center">
<img src="https://github.com/hitz-zentroa/This-is-not-a-Dataset/raw/main/assets/tittle.png" style="height: 250px;">
</p>
<h3 align="center">"A Large Negation Benchmark to Challenge Large Language Models"</h3>
<p align="justify">
We introduce a large semi-automatically generated dataset of ~400,000 descriptive sentences about commonsense knowledge that can be true or false in which negation is present in about 2/3 of the corpus in different forms that we use to evaluate LLMs.
</p>
- 📖 Paper: [This is not a Dataset: A Large Negation Benchmark to Challenge Large Language Models (EMNLP'23)](http://arxiv.org/abs/2310.15941)
- 💻 Baseline Code and the Official Scorer: [https://github.com/hitz-zentroa/This-is-not-a-Dataset](https://github.com/hitz-zentroa/This-is-not-a-Dataset)
<p align="center">
<img src="https://github.com/hitz-zentroa/This-is-not-a-Dataset/blob/main/assets/example.png?raw=true" style="height: 450px;">
</p>
# Data explanation
- **pattern_id** (int): The ID of the pattern, in range [1, 11]
- **pattern** (str): The name of the pattern
- **test_id** (int): For each pattern we use a set of templates to instantiate the triples. Examples are grouped into triples by test ID
- **negation_type** (str): Affirmation, verbal, non-verbal
- **semantic_type** (str): None (for affirmative sentences), analytic, synthetic
- **syntactic_scope** (str): None (for affirmative sentences), clausal, subclausal
- **isDistractor** (bool): We use distractors (randomly selected synsets) to generate false knowledge.
- **<span style="color:green">sentence</span>** (str): The sentence. <ins>This is the input of the model</ins>
- **<span style="color:green">label</span>** (bool): The label of the example, True if the statement is true, False otherwise. <ins>This is the target of the model</ins>
If you want to run experiments with this dataset, please, use the [Official Scorer](https://github.com/hitz-zentroa/This-is-not-a-Dataset#scorer) to ensure reproducibility and fairness.
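The two green fields define a binary classification task, and the remaining fields support per-category breakdowns of the kind the official scorer reports. As an illustration, grouped accuracy can be computed with plain Python; the rows below are toy examples (not real dataset entries), and the `prediction` column is an assumed model output, not a dataset field:

```python
from collections import defaultdict

# Toy rows mimicking the dataset schema above, plus an assumed model prediction.
rows = [
    {"negation_type": "affirmation", "label": True,  "prediction": True},
    {"negation_type": "verbal",      "label": False, "prediction": True},
    {"negation_type": "verbal",      "label": False, "prediction": False},
]

def accuracy_by(rows, key):
    """Accuracy of `prediction` against `label`, grouped by the given field."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in rows:
        totals[r[key]] += 1
        hits[r[key]] += int(r["prediction"] == r["label"])
    return {k: hits[k] / totals[k] for k in totals}

print(accuracy_by(rows, "negation_type"))  # → {'affirmation': 1.0, 'verbal': 0.5}
```

For published results, use the official scorer linked above rather than an ad-hoc computation like this one.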
# Citation
```bibtex
@inproceedings{garcia-ferrero-etal-2023-dataset,
title = "This is not a Dataset: A Large Negation Benchmark to Challenge Large Language Models",
author = "Garc{\'\i}a-Ferrero, Iker and
Altuna, Bego{\~n}a and
Alvez, Javier and
Gonzalez-Dios, Itziar and
Rigau, German",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.531",
doi = "10.18653/v1/2023.emnlp-main.531",
pages = "8596--8615",
abstract = "Although large language models (LLMs) have apparently acquired a certain level of grammatical knowledge and the ability to make generalizations, they fail to interpret negation, a crucial step in Natural Language Processing. We try to clarify the reasons for the sub-optimal performance of LLMs understanding negation. We introduce a large semi-automatically generated dataset of circa 400,000 descriptive sentences about commonsense knowledge that can be true or false in which negation is present in about 2/3 of the corpus in different forms. We have used our dataset with the largest available open LLMs in a zero-shot approach to grasp their generalization and inference capability and we have also fine-tuned some of the models to assess whether the understanding of negation can be trained. Our findings show that, while LLMs are proficient at classifying affirmative sentences, they struggle with negative sentences and lack a deep understanding of negation, often relying on superficial cues. Although fine-tuning the models on negative sentences improves their performance, the lack of generalization in handling negation is persistent, highlighting the ongoing challenges of LLMs regarding negation understanding and generalization. The dataset and code are publicly available.",
}
``` |
open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat | ---
pretty_name: Evaluation run of shareAI/llama2-13b-Chinese-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shareAI/llama2-13b-Chinese-chat](https://huggingface.co/shareAI/llama2-13b-Chinese-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T14:15:31.238109](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat/blob/main/results_2023-09-22T14-15-31.238109.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268803,\n \"f1\": 0.062396182885906,\n\
\ \"f1_stderr\": 0.0013783953134948932,\n \"acc\": 0.4400498930990388,\n\
\ \"acc_stderr\": 0.010318502304108787\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268803,\n\
\ \"f1\": 0.062396182885906,\n \"f1_stderr\": 0.0013783953134948932\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \
\ \"acc_stderr\": 0.008744810131034052\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/shareAI/llama2-13b-Chinese-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|arc:challenge|25_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T14_15_31.238109
path:
- '**/details_harness|drop|3_2023-09-22T14-15-31.238109.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T14-15-31.238109.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T14_15_31.238109
path:
- '**/details_harness|gsm8k|5_2023-09-22T14-15-31.238109.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T14-15-31.238109.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hellaswag|10_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T17:02:56.948315.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T17:02:56.948315.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T14_15_31.238109
path:
- '**/details_harness|winogrande|5_2023-09-22T14-15-31.238109.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T14-15-31.238109.parquet'
- config_name: results
data_files:
- split: 2023_08_09T17_02_56.948315
path:
- results_2023-08-09T17:02:56.948315.parquet
- split: 2023_09_22T14_15_31.238109
path:
- results_2023-09-22T14-15-31.238109.parquet
- split: latest
path:
- results_2023-09-22T14-15-31.238109.parquet
---
# Dataset Card for Evaluation run of shareAI/llama2-13b-Chinese-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shareAI/llama2-13b-Chinese-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [shareAI/llama2-13b-Chinese-chat](https://huggingface.co/shareAI/llama2-13b-Chinese-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat",
"harness_winogrande_5",
split="train")
```
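The timestamped split names in the config listing use underscores where the run timestamp uses `-` and `:`. A small hypothetical helper to map a run timestamp to its split name — the convention is inferred from the config listing above, not documented by the leaderboard:

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Map a run timestamp like '2023-09-22T14:15:31.238109' to the
    split name used in this dataset, e.g. '2023_09_22T14_15_31.238109'.
    Naming convention inferred from the config listing (an assumption)."""
    date_part, time_part = run_timestamp.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(timestamp_to_split("2023-09-22T14:15:31.238109"))
# 2023_09_22T14_15_31.238109
```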
## Latest results
These are the [latest results from run 2023-09-22T14:15:31.238109](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat/blob/main/results_2023-09-22T14-15-31.238109.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268803,
"f1": 0.062396182885906,
"f1_stderr": 0.0013783953134948932,
"acc": 0.4400498930990388,
"acc_stderr": 0.010318502304108787
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268803,
"f1": 0.062396182885906,
"f1_stderr": 0.0013783953134948932
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034052
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
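The results blob above is a plain nested mapping (task name → metric name → value), so it can be flattened into rows for tabulation. A minimal sketch using a subset of the numbers above:

```python
# Subset of the aggregated results shown above.
results = {
    "all": {"em": 0.0016778523489932886, "f1": 0.062396182885906,
            "acc": 0.4400498930990388},
    "harness|gsm8k|5": {"acc": 0.11372251705837756},
    "harness|winogrande|5": {"acc": 0.7663772691397001},
}

def flatten(results: dict) -> list:
    """Turn the nested task -> metric -> value mapping into flat rows."""
    return [(task, metric, value)
            for task, metrics in results.items()
            for metric, value in metrics.items()]

for task, metric, value in flatten(results):
    print(f"{task:24s} {metric:4s} {value:.4f}")
```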
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DanGlado/ddpm-butterflies-128 | ---
license: other
---
|
open-llm-leaderboard/details_garage-bAInd__Platypus2-70B | ---
pretty_name: Evaluation run of garage-bAInd/Platypus2-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Platypus2-70B](https://huggingface.co/garage-bAInd/Platypus2-70B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Platypus2-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T01:27:19.477950](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B/blob/main/results_2023-10-13T01-27-19.477950.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4649748322147651,\n\
\ \"em_stderr\": 0.005107889346229416,\n \"f1\": 0.5141369546979866,\n\
\ \"f1_stderr\": 0.004846183113432682,\n \"acc\": 0.58713939251053,\n\
\ \"acc_stderr\": 0.011581424079479265\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4649748322147651,\n \"em_stderr\": 0.005107889346229416,\n\
\ \"f1\": 0.5141369546979866,\n \"f1_stderr\": 0.004846183113432682\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3305534495830174,\n \
\ \"acc_stderr\": 0.012957496367085028\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873502\n\
\ }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Platypus2-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|arc:challenge|25_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T01_27_19.477950
path:
- '**/details_harness|drop|3_2023-10-13T01-27-19.477950.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T01-27-19.477950.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T01_27_19.477950
path:
- '**/details_harness|gsm8k|5_2023-10-13T01-27-19.477950.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T01-27-19.477950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hellaswag|10_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T02:16:23.299080.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T02:16:23.299080.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T01_27_19.477950
path:
- '**/details_harness|winogrande|5_2023-10-13T01-27-19.477950.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T01-27-19.477950.parquet'
- config_name: results
data_files:
- split: 2023_08_10T02_16_23.299080
path:
- results_2023-08-10T02:16:23.299080.parquet
- split: 2023_10_13T01_27_19.477950
path:
- results_2023-10-13T01-27-19.477950.parquet
- split: latest
path:
- results_2023-10-13T01-27-19.477950.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Platypus2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-70B](https://huggingface.co/garage-bAInd/Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Platypus2-70B",
"harness_winogrande_5",
	split="latest")
```
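Because the timestamped split names are zero-padded and use `_` in place of `-` and `:`, they sort lexicographically in chronological order. A minimal sketch of picking the newest run from a list of split names (the `latest` split already aliases this for you):

```python
def newest_run(split_names):
    """Pick the most recent timestamped split name.

    Split names look like '2023_08_10T02_16_23.299080' (zero-padded,
    with '_' in place of '-' and ':'), so lexicographic order equals
    chronological order; the 'latest' alias is skipped.
    """
    stamps = [s for s in split_names if s != "latest"]
    return max(stamps)

splits = ["2023_08_10T02_16_23.299080", "2023_10_13T01_27_19.477950", "latest"]
print(newest_run(splits))  # -> 2023_10_13T01_27_19.477950
```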
## Latest results
These are the [latest results from run 2023-10-13T01:27:19.477950](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B/blob/main/results_2023-10-13T01-27-19.477950.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4649748322147651,
"em_stderr": 0.005107889346229416,
"f1": 0.5141369546979866,
"f1_stderr": 0.004846183113432682,
"acc": 0.58713939251053,
"acc_stderr": 0.011581424079479265
},
"harness|drop|3": {
"em": 0.4649748322147651,
"em_stderr": 0.005107889346229416,
"f1": 0.5141369546979866,
"f1_stderr": 0.004846183113432682
},
"harness|gsm8k|5": {
"acc": 0.3305534495830174,
"acc_stderr": 0.012957496367085028
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873502
}
}
```
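For this run, the `acc` reported under `"all"` equals the unweighted mean of the two per-task accuracies. A quick sanity check, with the values copied from the JSON above:

```python
# Per-task accuracies copied from the results JSON above.
task_accs = {
    "harness|gsm8k|5": 0.3305534495830174,
    "harness|winogrande|5": 0.8437253354380426,
}

# Unweighted mean over tasks; matches the "all" accuracy (~0.5871) above.
mean_acc = sum(task_accs.values()) / len(task_accs)
print(mean_acc)
```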
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
cnmoro/Instruct-PTBR-10M | ---
license: mit
---
|
rbeauchamp/blip_50k_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: uint32
- name: step
dtype: uint16
- name: cfg
dtype: float32
- name: sampler
dtype: string
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: user_name
dtype: string
- name: timestamp
dtype: timestamp[us, tz=UTC]
- name: image_nsfw
dtype: float32
- name: prompt_nsfw
dtype: float32
splits:
- name: train
num_bytes: 18278206606.4
num_examples: 40000
download_size: 18679265100
dataset_size: 18278206606.4
---
# Dataset Card for "blip_50k_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__feed-sen_vi-894567-2175669984 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-125m
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-125m
* Dataset: futin/feed
* Config: sen_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
liuyanchen1015/MULTI_VALUE_sst2_volition_changes | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 234
num_examples: 2
- name: test
num_bytes: 1458
num_examples: 10
- name: train
num_bytes: 19018
num_examples: 176
download_size: 13238
dataset_size: 20710
---
# Dataset Card for "MULTI_VALUE_sst2_volition_changes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/minegumo_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of minegumo (Kantai Collection)
This is the dataset of minegumo (Kantai Collection), containing 421 images and their tags.
The core tags of this character are `long_hair, braid, twin_braids, light_brown_hair, red_eyes, breasts, gradient_hair, multicolored_hair, brown_eyes, bow, red_bow, large_breasts, plaid_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 421 | 404.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minegumo_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 421 | 254.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minegumo_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 960 | 536.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minegumo_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 421 | 367.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minegumo_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 960 | 709.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minegumo_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/minegumo_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
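The non-raw packages in the table above use the IMG+TXT layout, where each image ships with a same-stem `.txt` tag file. A minimal sketch of pairing images with their tag files after extracting one of those zips (the filenames below are hypothetical):

```python
from pathlib import PurePath

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def pair_img_txt(filenames):
    """Pair image files with their same-stem .txt tag files."""
    stems = {}
    for name in filenames:
        p = PurePath(name)
        stems.setdefault(p.stem, []).append(p.suffix.lower())
    pairs = []
    for stem, exts in sorted(stems.items()):
        # Keep only stems that have both an image and a .txt companion.
        if ".txt" in exts and any(e in IMAGE_EXTS for e in exts):
            img_ext = next(e for e in exts if e in IMAGE_EXTS)
            pairs.append((stem + img_ext, stem + ".txt"))
    return pairs

print(pair_img_txt(["0001.png", "0001.txt", "0002.jpg", "0002.txt", "meta.json"]))
# -> [('0001.png', '0001.txt'), ('0002.jpg', '0002.txt')]
```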
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, arm_warmers, grey_skirt, looking_at_viewer, plaid_bowtie, pleated_skirt, red_bowtie, school_uniform, short_sleeves, simple_background, solo, suspender_skirt, white_shirt, cowboy_shot, smile, white_background, low_twin_braids, open_mouth, dated |
| 1 | 11 |  |  |  |  |  | 1girl, cleavage, solo, open_mouth, blue_bra, collarbone, looking_at_viewer, medium_breasts, navel, underwear_only, blue_panties, blush, cowboy_shot |
| 2 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, upper_body, blush, smile, turtleneck, brown_sweater, long_sleeves, simple_background, red_sweater, alternate_costume, white_background, low_twin_braids, one-hour_drawing_challenge |
| 3 | 8 |  |  |  |  |  | 1girl, dress, long_sleeves, solo, blush, black_pantyhose, simple_background, white_background, beans, brown_sweater, cloud_print, full_body, looking_at_viewer, masu, open_mouth, box, setsubun, shoes, smile, black_footwear, red_sweater |
| 4 | 24 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, collarbone, simple_background, white_background, cleavage, open_mouth, cowboy_shot, bikini, one-hour_drawing_challenge, navel, twitter_username, blue_one-piece_swimsuit, smile |
| 5 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_apron, enmaided, frilled_apron, maid_headdress, waist_apron, simple_background, white_background, cowboy_shot, one-hour_drawing_challenge, cleavage, dated, red_bowtie, white_thighhighs, black_dress, low_twin_braids, skirt, white_shirt, wrist_cuffs |
| 6 | 6 |  |  |  |  |  | 1girl, alternate_costume, cheerleader, holding_pom_poms, midriff, navel, open_mouth, pleated_skirt, sleeveless_shirt, smile, solo, blush, looking_at_viewer, crop_top_overhang, simple_background, white_background, bike_shorts, black_shorts, cowboy_shot, one-hour_drawing_challenge, shorts_under_skirt, white_skirt |
| 7 | 5 |  |  |  |  |  | 1girl, cleavage, detached_collar, playboy_bunny, rabbit_ears, simple_background, strapless_leotard, white_background, black_leotard, brown_pantyhose, fake_animal_ears, looking_at_viewer, solo, wrist_cuffs, alternate_costume, blush, red_bowtie, artist_logo, black_pantyhose, dated, low_twin_braids, medium_breasts, open_mouth, rabbit_tail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | arm_warmers | grey_skirt | looking_at_viewer | plaid_bowtie | pleated_skirt | red_bowtie | school_uniform | short_sleeves | simple_background | solo | suspender_skirt | white_shirt | cowboy_shot | smile | white_background | low_twin_braids | open_mouth | dated | cleavage | blue_bra | collarbone | medium_breasts | navel | underwear_only | blue_panties | blush | upper_body | turtleneck | brown_sweater | long_sleeves | red_sweater | alternate_costume | one-hour_drawing_challenge | dress | black_pantyhose | beans | cloud_print | full_body | masu | box | setsubun | shoes | black_footwear | bikini | twitter_username | blue_one-piece_swimsuit | white_apron | enmaided | frilled_apron | maid_headdress | waist_apron | white_thighhighs | black_dress | skirt | wrist_cuffs | cheerleader | holding_pom_poms | midriff | sleeveless_shirt | crop_top_overhang | bike_shorts | black_shorts | shorts_under_skirt | white_skirt | detached_collar | playboy_bunny | rabbit_ears | strapless_leotard | black_leotard | brown_pantyhose | fake_animal_ears | artist_logo | rabbit_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------------|:--------------------|:---------------|:----------------|:-------------|:-----------------|:----------------|:--------------------|:-------|:------------------|:--------------|:--------------|:--------|:-------------------|:------------------|:-------------|:--------|:-----------|:-----------|:-------------|:-----------------|:--------|:-----------------|:---------------|:--------|:-------------|:-------------|:----------------|:---------------|:--------------|:--------------------|:-----------------------------|:--------|:------------------|:--------|:--------------|:------------|:-------|:------|:-----------|:--------|:-----------------|:---------|:-------------------|:--------------------------|:--------------|:-----------|:----------------|:-----------------|:--------------|:-------------------|:--------------|:--------|:--------------|:--------------|:-------------------|:----------|:-------------------|:--------------------|:--------------|:---------------|:---------------------|:--------------|:------------------|:----------------|:--------------|:--------------------|:----------------|:------------------|:-------------------|:--------------|:--------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | | X | | | | | | | X | | | X | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | | | | | | X | X | | | | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | X | | | | | | X | X | | | | X | X | | X | | | | | | | | | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 24 |  |  |  |  |  | X | | | X | | | | | | X | X | | | X | X | X | | X | | X | | X | | X | | | X | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | | X | | | X | | | X | X | | X | X | | X | X | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | | X | | | | X | X | | | X | X | X | | X | | | | | | X | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | | | X | | | X | X | | | | | X | X | X | X | X | | | X | | | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
reciprocate/tinygsm_dpo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: selected
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 8434939.647259163
num_examples: 5857
- name: test
num_bytes: 445005.35274083685
num_examples: 309
download_size: 3047519
dataset_size: 8879945.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
armonia/wasm-smart-contract | ---
license: mit
---
|
lin-df4g/3.0 | ---
license: gpl-3.0
---
|
HydraLM/partitioned_v3_32 | ---
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: cluster_text
dtype: string
- name: embedding
sequence: float64
- name: unique_id
dtype: string
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 5930615489
num_examples: 599929
download_size: 4013780446
dataset_size: 5930615489
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chuckreynolds/wikimedia-enterprise-wikiquote-english | ---
license: cc-by-sa-3.0
task_categories:
- conversational
language:
- en
pretty_name: Wikimedia Enterprise Wikiquote English snapshot
---
A Wikimedia Enterprise Snapshot of the English Wikiquote project, taken December 1, 2023.
- Docs => https://enterprise.wikimedia.com/docs/snapshot/
- Schema => https://enterprise.wikimedia.com/docs/data-dictionary/ |
notsobad9527/chinese-joke | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-attention-sparsity-20 | ---
pretty_name: Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-attention-sparsity-20\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T20:11:05.544103](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-attention-sparsity-20/blob/main/results_2024-01-25T20-11-05.544103.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6080107405407549,\n\
\ \"acc_stderr\": 0.033123570691062657,\n \"acc_norm\": 0.6125186012133447,\n\
\ \"acc_norm_stderr\": 0.033796374202489106,\n \"mc1\": 0.5348837209302325,\n\
\ \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.6826355141109229,\n\
\ \"mc2_stderr\": 0.015165454014454297\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403082,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142825\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6682931686914957,\n\
\ \"acc_stderr\": 0.004698640688271199,\n \"acc_norm\": 0.8484365664210317,\n\
\ \"acc_norm_stderr\": 0.003578643387547847\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176085,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176085\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.01480538447837115,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.01480538447837115\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208176,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208176\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438898,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495033,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495033\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529672,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529672\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249765,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249765\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5348837209302325,\n\
\ \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.6826355141109229,\n\
\ \"mc2_stderr\": 0.015165454014454297\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39727065959059893,\n \
\ \"acc_stderr\": 0.01347865965233779\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|arc:challenge|25_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|gsm8k|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hellaswag|10_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-11-05.544103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T20-11-05.544103.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- '**/details_harness|winogrande|5_2024-01-25T20-11-05.544103.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T20-11-05.544103.parquet'
- config_name: results
data_files:
- split: 2024_01_25T20_11_05.544103
path:
- results_2024-01-25T20-11-05.544103.parquet
- split: latest
path:
- results_2024-01-25T20-11-05.544103.parquet
---
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-attention-sparsity-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-attention-sparsity-20",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-25T20:11:05.544103](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-attention-sparsity-20/blob/main/results_2024-01-25T20-11-05.544103.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6080107405407549,
"acc_stderr": 0.033123570691062657,
"acc_norm": 0.6125186012133447,
"acc_norm_stderr": 0.033796374202489106,
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.6826355141109229,
"mc2_stderr": 0.015165454014454297
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403082,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142825
},
"harness|hellaswag|10": {
"acc": 0.6682931686914957,
"acc_stderr": 0.004698640688271199,
"acc_norm": 0.8484365664210317,
"acc_norm_stderr": 0.003578643387547847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334388,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334388
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176085,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176085
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.01480538447837115,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.01480538447837115
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208176,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208176
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438898,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495033,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495033
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529672,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529672
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249765,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.6826355141109229,
"mc2_stderr": 0.015165454014454297
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.39727065959059893,
"acc_stderr": 0.01347865965233779
}
}
```
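As a rough illustration of how the aggregate `acc` above relates to the per-task scores, the sketch below macro-averages a hypothetical subset of three task accuracies hard-coded from the JSON above (the real "all" aggregate averages every task, not just these three):

```python
# Sketch: macro-averaging per-task accuracies. The three values are
# hard-coded from the results JSON above; this subset is for illustration
# only -- the leaderboard's "all" aggregate averages all tasks.
scores = {
    "hendrycksTest-abstract_algebra": 0.32,
    "hendrycksTest-anatomy": 0.5777777777777777,
    "hendrycksTest-astronomy": 0.6447368421052632,
}

macro_avg = sum(scores.values()) / len(scores)
print(f"macro-average acc over {len(scores)} tasks: {macro_avg:.4f}")
# -> macro-average acc over 3 tasks: 0.5142
```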
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HyaDoo/hd-bert-voicephishing-binary-classification-ver5 | ---
license: apache-2.0
---
|
thegoodfellas/brwac_tiny | ---
annotations_creators:
- no-annotation
language:
- pt
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: brwac
size_categories:
- 10M<n<100M
source_datasets:
- original
tags:
- ufrgs
- nlp
- brazil
task_categories:
- fill-mask
task_ids:
- masked-language-modeling
---
# Dataset Card for BrWac
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [BrWaC homepage](https://www.inf.ufrgs.br/pln/wiki/index.php?title=BrWaC)
- **Repository:** [BrWaC repository](https://www.inf.ufrgs.br/pln/wiki/index.php?title=BrWaC)
- **Paper:** [The brWaC Corpus: A New Open Resource for Brazilian Portuguese](https://www.aclweb.org/anthology/L18-1686/)
- **Point of Contact:** [Jorge A. Wagner Filho](mailto:jawfilho@inf.ufrgs.br)
### Dataset Summary
The BrWaC (Brazilian Portuguese Web as Corpus) is a large corpus constructed following the Wacky framework,
which was made public for research purposes. The current corpus version, released in January 2017, is composed of
3.53 million documents, 2.68 billion tokens and 5.79 million types. Please note that this resource is available
solely for academic research purposes, and by using it you agree not to use it for any commercial applications.

The full corpus can be downloaded manually at https://www.inf.ufrgs.br/pln/wiki/index.php?title=BrWaC

This is a tiny version of the full dataset, intended for educational purposes. Please refer to https://github.com/the-good-fellas/xlm-roberta-pt-br
### Supported Tasks and Leaderboards
Initially meant for the fill-mask task.
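The masked-language-modeling objective can be illustrated with a minimal, library-free sketch. The 15% masking ratio and the `[MASK]` token are BERT-style conventions assumed here for illustration; they are not part of this dataset:

```python
import random

def mask_tokens(text, mask_token="[MASK]", ratio=0.15, seed=0):
    """Replace a random ~15% of whitespace tokens with the mask token,
    returning the corrupted input and the original tokens as labels."""
    rng = random.Random(seed)
    tokens = text.split()
    n_mask = max(1, round(len(tokens) * ratio))
    positions = rng.sample(range(len(tokens)), n_mask)
    labels = {i: tokens[i] for i in positions}
    corrupted = [mask_token if i in labels else tok for i, tok in enumerate(tokens)]
    return " ".join(corrupted), labels

corrupted, labels = mask_tokens(
    "o corpus brWaC contém textos da web em português brasileiro"
)
print(corrupted)
```

In practice a subword tokenizer replaces the naive whitespace split used here, and the `labels` dict plays the role of the targets the model must recover.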
### Languages
Brazilian Portuguese
## Dataset Creation
### Personal and Sensitive Information
All data were extracted from public sites.
### Licensing Information
MIT
### Citation Information
```
@inproceedings{wagner2018brwac,
  title={The brWaC Corpus: A New Open Resource for Brazilian Portuguese},
  author={Wagner Filho, Jorge A. and Wilkens, Rodrigo and Idiart, Marco and Villavicencio, Aline},
  booktitle={Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
  year={2018}
}
```
### Contributions
Thanks to [@the-good-fellas](https://github.com/the-good-fellas) for adding this dataset in HF format. |
bulkbeings/patient-alumini-v1 | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_249 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1102881372.0
num_examples: 216591
download_size: 1125482720
dataset_size: 1102881372.0
---
# Dataset Card for "chunk_249"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
schen357/corpjargon | ---
language:
- en
size_categories:
- n<1K
--- |
obahamonde/qa-latam | ---
dataset_info:
features:
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 9193520
num_examples: 5710
download_size: 3774319
dataset_size: 9193520
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alzoubi36/policy_qa | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: validation
num_bytes: 2902927
num_examples: 3809
- name: test
num_bytes: 3667235
num_examples: 4152
- name: train
num_bytes: 13859759
num_examples: 17056
download_size: 2662048
dataset_size: 20429921
---
# Dataset for the PolicyQA task in the [PrivacyGLUE](https://github.com/infsys-lab/privacy-glue) dataset
|
freshpearYoon/vr_train_free_53 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 5921878248
num_examples: 10000
download_size: 906446708
dataset_size: 5921878248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HydraLM/partitioned_v3_light | ---
configs:
- config_name: default
data_files:
- split: '0'
path: data/0-*
- split: '1'
path: data/1-*
- split: '2'
path: data/2-*
- split: '3'
path: data/3-*
- split: '4'
path: data/4-*
- split: '5'
path: data/5-*
- split: '6'
path: data/6-*
- split: '7'
path: data/7-*
- split: '8'
path: data/8-*
- split: '9'
path: data/9-*
- split: '10'
path: data/10-*
- split: '11'
path: data/11-*
- split: '12'
path: data/12-*
- split: '13'
path: data/13-*
- split: '14'
path: data/14-*
- split: '15'
path: data/15-*
- split: '16'
path: data/16-*
- split: '17'
path: data/17-*
- split: '18'
path: data/18-*
- split: '19'
path: data/19-*
- split: '20'
path: data/20-*
- split: '21'
path: data/21-*
- split: '22'
path: data/22-*
- split: '23'
path: data/23-*
- split: '24'
path: data/24-*
- split: '25'
path: data/25-*
- split: '26'
path: data/26-*
- split: '27'
path: data/27-*
- split: '28'
path: data/28-*
- split: '29'
path: data/29-*
- split: '30'
path: data/30-*
- split: '31'
path: data/31-*
dataset_info:
features:
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: cluster_text
dtype: string
- name: unique_id
dtype: string
- name: cluster
dtype: int64
- name: id
dtype: int64
splits:
- name: '0'
num_bytes: 30992664
num_examples: 16523
- name: '1'
num_bytes: 52095796
num_examples: 16425
- name: '2'
num_bytes: 47561841
num_examples: 25909
- name: '3'
num_bytes: 2815376
num_examples: 5684
- name: '4'
num_bytes: 58605236
num_examples: 21059
- name: '5'
num_bytes: 8155103
num_examples: 6470
- name: '6'
num_bytes: 128701190
num_examples: 24422
- name: '7'
num_bytes: 38130966
num_examples: 26253
- name: '8'
num_bytes: 11186625
num_examples: 15819
- name: '9'
num_bytes: 39419303
num_examples: 14042
- name: '10'
num_bytes: 21521823
num_examples: 7654
- name: '11'
num_bytes: 120962836
num_examples: 23956
- name: '12'
num_bytes: 36300158
num_examples: 14898
- name: '13'
num_bytes: 24926182
num_examples: 23098
- name: '14'
num_bytes: 10550746
num_examples: 10271
- name: '15'
num_bytes: 50092026
num_examples: 24944
- name: '16'
num_bytes: 22094384
num_examples: 10785
- name: '17'
num_bytes: 18684676
num_examples: 14417
- name: '18'
num_bytes: 26827192
num_examples: 32254
- name: '19'
num_bytes: 7490725
num_examples: 10446
- name: '20'
num_bytes: 23774066
num_examples: 40593
- name: '21'
num_bytes: 23942749
num_examples: 17353
- name: '22'
num_bytes: 79104576
num_examples: 47188
- name: '23'
num_bytes: 65591366
num_examples: 15443
- name: '24'
num_bytes: 29085329
num_examples: 10707
- name: '25'
num_bytes: 14869667
num_examples: 9539
- name: '26'
num_bytes: 14156821
num_examples: 16207
- name: '27'
num_bytes: 13720088
num_examples: 5294
- name: '28'
num_bytes: 12888055
num_examples: 16797
- name: '29'
num_bytes: 24111036
num_examples: 9189
- name: '30'
num_bytes: 27279270
num_examples: 41940
- name: '31'
num_bytes: 56129266
num_examples: 24350
download_size: 476510182
dataset_size: 1141767137
---
# Dataset Card for "partitioned_v3_light"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tiennv/english-mc4 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24653765251
num_examples: 14294240
download_size: 15068999152
dataset_size: 24653765251
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "english-mc4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrey200702/UPD | ---
license: apache-2.0
---
|
chenhaodev/aocnp_oncc_practice_test | ---
dataset_info:
features:
- name: input
list:
- name: content
dtype: string
- name: role
dtype: string
- name: ideal
dtype: string
splits:
- name: train
num_bytes: 29779
num_examples: 100
download_size: 19617
dataset_size: 29779
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SaffalPoosh/sample_controlnet_dataset | ---
license: apache-2.0
task_categories:
- text-to-image
language:
- en
tags:
- code
pretty_name: ControlNet training
---
# ControlNet training
This dataset is a subset of the **fill_50k** dataset, intended just to test the fine-tuning logic.
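As a rough sketch of how such a smoke-test subset can be carved out reproducibly (plain Python over toy records; the subset size and seed are arbitrary assumptions — with the 🤗 `datasets` library one would typically use `Dataset.select` instead):

```python
import random

def smoke_test_subset(records, k=50, seed=42):
    """Pick a small, reproducible sample of records for a fine-tuning smoke test."""
    rng = random.Random(seed)
    k = min(k, len(records))
    # sorted indices keep the original record order within the subset
    return [records[i] for i in sorted(rng.sample(range(len(records)), k))]

# toy records standing in for (image, conditioning, prompt) rows
records = [{"prompt": f"circle #{i}"} for i in range(500)]
subset = smoke_test_subset(records)
print(len(subset))  # 50
```

A fixed seed keeps the subset identical across runs, so a failing fine-tuning smoke test is reproducible.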
> *TODO*:
- [ ] add text data
|
open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B | ---
pretty_name: Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [joey00072/ToxicHermes-2.5-Mistral-7B](https://huggingface.co/joey00072/ToxicHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T17:12:50.867091](https://huggingface.co/datasets/open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B/blob/main/results_2023-12-23T17-12-50.867091.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6310556791538692,\n\
\ \"acc_stderr\": 0.032203868447530745,\n \"acc_norm\": 0.6402886061157618,\n\
\ \"acc_norm_stderr\": 0.032897055185662744,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5083945294452454,\n\
\ \"mc2_stderr\": 0.015230833666821306\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6032423208191127,\n \"acc_stderr\": 0.014296513020180646,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756562\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6448914558852819,\n\
\ \"acc_stderr\": 0.004775681871529863,\n \"acc_norm\": 0.8374825731925911,\n\
\ \"acc_norm_stderr\": 0.003681708282581456\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\
\ \"acc_stderr\": 0.015445716910998884,\n \"acc_norm\": 0.30837988826815643,\n\
\ \"acc_norm_stderr\": 0.015445716910998884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n\
\ \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162662,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162662\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5083945294452454,\n\
\ \"mc2_stderr\": 0.015230833666821306\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643414\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17361637604245642,\n \
\ \"acc_stderr\": 0.01043346322125761\n }\n}\n```"
repo_url: https://huggingface.co/joey00072/ToxicHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|arc:challenge|25_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|gsm8k|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hellaswag|10_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T17-12-50.867091.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- '**/details_harness|winogrande|5_2023-12-23T17-12-50.867091.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T17-12-50.867091.parquet'
- config_name: results
data_files:
- split: 2023_12_23T17_12_50.867091
path:
- results_2023-12-23T17-12-50.867091.parquet
- split: latest
path:
- results_2023-12-23T17-12-50.867091.parquet
---
# Dataset Card for Evaluation run of joey00072/ToxicHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [joey00072/ToxicHermes-2.5-Mistral-7B](https://huggingface.co/joey00072/ToxicHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-23T17:12:50.867091](https://huggingface.co/datasets/open-llm-leaderboard/details_joey00072__ToxicHermes-2.5-Mistral-7B/blob/main/results_2023-12-23T17-12-50.867091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6310556791538692,
"acc_stderr": 0.032203868447530745,
"acc_norm": 0.6402886061157618,
"acc_norm_stderr": 0.032897055185662744,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5083945294452454,
"mc2_stderr": 0.015230833666821306
},
"harness|arc:challenge|25": {
"acc": 0.6032423208191127,
"acc_stderr": 0.014296513020180646,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756562
},
"harness|hellaswag|10": {
"acc": 0.6448914558852819,
"acc_stderr": 0.004775681871529863,
"acc_norm": 0.8374825731925911,
"acc_norm_stderr": 0.003681708282581456
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998884,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162662,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5083945294452454,
"mc2_stderr": 0.015230833666821306
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643414
},
"harness|gsm8k|5": {
"acc": 0.17361637604245642,
"acc_stderr": 0.01043346322125761
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Dhika/defectfft | ---
license: unknown
---
|
mikegarts/oa_tell_a_joke_100 | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
- name: METADATA
struct:
- name: link
dtype: string
- name: nsfw
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 59031
num_examples: 100
download_size: 0
dataset_size: 59031
---
# Dataset Card for "oa_tell_a_joke_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mlnchk/CL_nature | ---
license: mit
---
|
davanstrien/fuego-20230322-205840-8c6f25 | ---
tags:
- fuego
fuego:
id: 20230322-205840-8c6f25
status: done
script: script.py
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230322-205840-8c6f25
space_hardware: cpu-basic
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0a67a744 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1341
dataset_size: 186
---
# Dataset Card for "0a67a744"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
enoahjr/twitter_dataset_1713204733 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 159445
num_examples: 411
download_size: 75089
dataset_size: 159445
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
benayas/snips_artificial_20pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1085422
num_examples: 13084
download_size: 405642
dataset_size: 1085422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kabachuha/wesnoth-ethea-canon-campaigns | ---
license: gpl-2.0
task_categories:
- text-generation
language:
- en
tags:
- art
- code
- gamedev
- scenarios
- writing
- literature
- wesnoth
--- |
open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85 | ---
pretty_name: Evaluation run of uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85](https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T19:19:22.420919](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public/blob/main/results_2023-11-23T19-19-22.420919.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6373539881235634,\n\
\ \"acc_stderr\": 0.032200043467933794,\n \"acc_norm\": 0.6462425671540708,\n\
\ \"acc_norm_stderr\": 0.032891781056948864,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.44867041308885225,\n\
\ \"mc2_stderr\": 0.014511741253113358,\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.06318477348993282,\n\
\ \"f1_stderr\": 0.0013946687452644612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5802047781569966,\n \"acc_stderr\": 0.014422181226303026,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892893\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6451902011551484,\n\
\ \"acc_stderr\": 0.004774778180345194,\n \"acc_norm\": 0.8430591515634336,\n\
\ \"acc_norm_stderr\": 0.0036300159898963996\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612927,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973133,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n\
\ \"acc_stderr\": 0.015694238967737383,\n \"acc_norm\": 0.32737430167597764,\n\
\ \"acc_norm_stderr\": 0.015694238967737383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.44867041308885225,\n\
\ \"mc2_stderr\": 0.014511741253113358\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223194\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \
\ \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.06318477348993282,\n\
\ \"f1_stderr\": 0.0013946687452644612\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.18953752843062927,\n \"acc_stderr\": 0.010795837931896386\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|drop|3_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-19-22.420919.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- '**/details_harness|winogrande|5_2023-11-23T19-19-22.420919.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T19-19-22.420919.parquet'
- config_name: results
data_files:
- split: 2023_11_23T19_19_22.420919
path:
- results_2023-11-23T19-19-22.420919.parquet
- split: latest
path:
- results_2023-11-23T19-19-22.420919.parquet
---
# Dataset Card for Evaluation run of uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85](https://huggingface.co/uukuguy/CollectiveCognition-v1.1-Mistral-7B-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public",
"harness_winogrande_5",
	split="latest")
```
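The config names above follow a regular convention: `harness_` plus the task name with `-` and `:` replaced by `_`, plus the number of few-shot examples. A small helper can build the config name for any task before calling `load_dataset` (note that `task_to_config` is a hypothetical illustration of the naming pattern, not part of the `datasets` API):

```python
def task_to_config(task: str, shots: int) -> str:
    """Map a harness task name to this repo's config-name convention.

    Hypothetical helper (assumption, not part of the datasets API):
    e.g. "hendrycksTest-world_religions" -> "harness_hendrycksTest_world_religions_5"
    """
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{shots}"


print(task_to_config("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(task_to_config("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```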
## Latest results
These are the [latest results from run 2023-11-23T19:19:22.420919](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__CollectiveCognition-v1.1-Mistral-7B-dare-0.85_public/blob/main/results_2023-11-23T19-19-22.420919.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6373539881235634,
"acc_stderr": 0.032200043467933794,
"acc_norm": 0.6462425671540708,
"acc_norm_stderr": 0.032891781056948864,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.44867041308885225,
"mc2_stderr": 0.014511741253113358,
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.06318477348993282,
"f1_stderr": 0.0013946687452644612
},
"harness|arc:challenge|25": {
"acc": 0.5802047781569966,
"acc_stderr": 0.014422181226303026,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892893
},
"harness|hellaswag|10": {
"acc": 0.6451902011551484,
"acc_stderr": 0.004774778180345194,
"acc_norm": 0.8430591515634336,
"acc_norm_stderr": 0.0036300159898963996
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612927,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973133,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737383,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277738,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.44867041308885225,
"mc2_stderr": 0.014511741253113358
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223194
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.06318477348993282,
"f1_stderr": 0.0013946687452644612
},
"harness|gsm8k|5": {
"acc": 0.18953752843062927,
"acc_stderr": 0.010795837931896386
}
}
```
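The results above follow the EleutherAI `lm-evaluation-harness` JSON layout: a mapping from `task|name|n-shot` keys to metric dictionaries. A minimal sketch for extracting per-task accuracy and a macro average from such a blob (the inline sample is a hypothetical three-task excerpt of the full results, which in practice you would `json.load()` from the results file in this repository):

```python
import json

# Hypothetical excerpt of the harness results shown above.
raw = """
{
  "harness|hendrycksTest-management|5": {"acc": 0.8252427184466019, "acc_norm": 0.8252427184466019},
  "harness|hendrycksTest-marketing|5": {"acc": 0.8717948717948718, "acc_norm": 0.8717948717948718},
  "harness|gsm8k|5": {"acc": 0.18953752843062927}
}
"""

results = json.loads(raw)

# Keep only tasks that report a plain "acc" metric (some, like truthfulqa:mc
# or drop, use other metric names), then macro-average across tasks.
accs = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
macro_avg = sum(accs.values()) / len(accs)

for task, acc in sorted(accs.items()):
    print(f"{task}: {acc:.4f}")
print(f"macro average: {macro_avg:.4f}")
```

Note that leaderboard aggregates are computed over the full task set, so the macro average here only illustrates the mechanics, not the reported score.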
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |