| datasetId | card |
|---|---|
ML-Projects-Kiel/tweetyface | ---
annotations_creators:
- machine-generated
language:
- en
- de
language_creators:
- crowdsourced
license:
- apache-2.0
multilinguality:
- multilingual
pretty_name: tweetyface_en
size_categories:
- 10K<n<100K
source_datasets: []
tags: []
task_categories:
- text-generation
task_ids: []
---
# Dataset Card for "tweetyface"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [GitHub](https://github.com/ml-projects-kiel/OpenCampus-ApplicationofTransformers)
### Dataset Summary
A dataset of tweets from prominent Twitter users.
The dataset was created using a crawler built on the Twitter API.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English, German
## Dataset Structure
### Data Instances
#### english
- **Size of downloaded dataset files:** 4.77 MB
- **Size of the generated dataset:** 5.92 MB
- **Total amount of disk used:** 4.77 MB
#### german
- **Size of downloaded dataset files:** 2.58 MB
- **Size of the generated dataset:** 3.10 MB
- **Total amount of disk used:** 2.59 MB
An example from the `validation` split looks as follows:
```json
{
  "text": "@SpaceX @Space_Station About twice as much useful mass to orbit as rest of Earth combined",
  "label": "elonmusk",
  "idx": 1001283
}
```
### Data Fields
The data fields are the same among all splits and languages.
- `text`: a `string` feature.
- `label`: a classification label.
- `idx`: an `int64` feature.
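A minimal sketch of checking a record against this schema in plain Python (the sample record below is illustrative, not taken from the dataset):

```python
import json

# An illustrative record following the field schema above
raw = '{"text": "hello world", "label": "elonmusk", "idx": 42}'
record = json.loads(raw)

# Verify each field has the documented type
assert isinstance(record["text"], str)   # `text`: string feature
assert isinstance(record["label"], str)  # `label`: classification label
assert isinstance(record["idx"], int)    # `idx`: int64 feature
print("record ok:", record["idx"])  # -> record ok: 42
```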
### Data Splits
| name | train | validation |
| ------- | ----: | ---------: |
| english | 27857 | 6965 |
| german | 10254 | 2564 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
|
CyberMaike/kendl | ---
license: openrail
---
|
Norod78/microsoft-fluentui-emoji-768 | ---
language: en
license: mit
size_categories:
- n<10K
task_categories:
- text-to-image
pretty_name: Microsoft FluentUI Emoji 768x768
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 679617796.94
num_examples: 7564
download_size: 704564297
dataset_size: 679617796.94
tags:
- emoji
- fluentui
---
# Dataset Card for "microsoft-fluentui-emoji-768"
SVGs and their file names from [Microsoft's fluentui-emoji repo](https://github.com/microsoft/fluentui-emoji) were converted to images and text. |
chuquan282/CBD_ERROR_LOGS | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: float64
- name: output
dtype: string
splits:
- name: train
num_bytes: 767437.1685606061
num_examples: 1900
- name: test
num_bytes: 85629.83143939394
num_examples: 212
download_size: 270309
dataset_size: 853067.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/hoshii_miki_theidolmster | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hoshii_miki/星井美希/호시이미키 (THE iDOLM@STER)
This is the dataset of hoshii_miki/星井美希/호시이미키 (THE iDOLM@STER), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, green_eyes, ahoge, breasts`; these are pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 523.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshii_miki_theidolmster/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 352.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshii_miki_theidolmster/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1116 | 702.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshii_miki_theidolmster/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 484.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshii_miki_theidolmster/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1116 | 912.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshii_miki_theidolmster/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hoshii_miki_theidolmster',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
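For the IMG+TXT packages, each image is expected to ship with a same-stem `.txt` tag file; a minimal sketch of pairing them after extraction (the directory layout is assumed, and toy files are created here purely for illustration):

```python
import tempfile
from pathlib import Path

# Build a toy extracted directory that mimics the assumed IMG+TXT layout
root = Path(tempfile.mkdtemp())
(root / "0001.png").write_bytes(b"fake image bytes")
(root / "0001.txt").write_text("1girl, solo, smile")

# Pair each image with its same-stem caption file
pairs = []
for img in sorted(root.glob("*.png")):
    caption = img.with_suffix(".txt")
    if caption.exists():
        pairs.append((img.name, caption.read_text()))

print(pairs)  # [('0001.png', '1girl, solo, smile')]
```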
## List of Clusters
Below are the tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, solo, midriff, navel, skirt, smile, open_mouth, thighhighs |
| 1 | 8 |  |  |  |  |  | 1girl, cleavage, medium_breasts, navel, single_leg_pantyhose, smile, solo, fishnet_pantyhose, midriff, belly_chain, jacket, open_mouth, pink_shorts, necklace, one_eye_closed, star_(symbol), yellow_bra |
| 2 | 13 |  |  |  |  |  | 1girl, open_mouth, smile, solo, blush, ;d, one_eye_closed, star_(symbol) |
| 3 | 10 |  |  |  |  |  | 1girl, smile, solo, blush, looking_at_viewer, simple_background, necklace, open_mouth, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, flower, solo, elbow_gloves, smile, wedding_dress, boots, bridal_veil, hair_ornament, open_mouth, white_dress |
| 5 | 6 |  |  |  |  |  | 1girl, plaid_skirt, school_uniform, solo, smile, open_mouth, star_(symbol), blush, necktie |
| 6 | 7 |  |  |  |  |  | 1girl, cleavage, smile, solo, medium_breasts, open_mouth, day, navel, blush, looking_at_viewer, side-tie_bikini_bottom, sky, beach, cloud, green_bikini, outdoors, wet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | midriff | navel | skirt | smile | open_mouth | thighhighs | cleavage | medium_breasts | single_leg_pantyhose | fishnet_pantyhose | belly_chain | jacket | pink_shorts | necklace | one_eye_closed | star_(symbol) | yellow_bra | blush | ;d | looking_at_viewer | simple_background | white_background | flower | elbow_gloves | wedding_dress | boots | bridal_veil | hair_ornament | white_dress | plaid_skirt | school_uniform | necktie | day | side-tie_bikini_bottom | sky | beach | cloud | green_bikini | outdoors | wet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------|:--------|:--------|:-------------|:-------------|:-----------|:-----------------|:-----------------------|:--------------------|:--------------|:---------|:--------------|:-----------|:-----------------|:----------------|:-------------|:--------|:-----|:--------------------|:--------------------|:-------------------|:---------|:---------------|:----------------|:--------|:--------------|:----------------|:--------------|:--------------|:-----------------|:----------|:------|:-------------------------|:------|:--------|:--------|:---------------|:-----------|:------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | | | | X | X | | | | | | | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | | | | X | X | | | | | | | | | X | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | | | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | | | X | X | | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | X | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | | X | | X | X | | X | X | | | | | | | | | | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
fiveflow/koquad_v2_polyglot_tkd_20th | ---
dataset_info:
features:
- name: context
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1766922390
num_examples: 20000
download_size: 592965039
dataset_size: 1766922390
---
# Dataset Card for "koquad_v2_polyglot_tkd_20th"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zenodia/dreambooth-bee-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1640816.0
num_examples: 6
download_size: 1626376
dataset_size: 1640816.0
---
# Dataset Card for "dreambooth-bee-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_36 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 9147419424.25
num_examples: 95238
download_size: 8330550680
dataset_size: 9147419424.25
---
# Dataset Card for "chunk_36"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Litalp/audio-class | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype: string
splits:
- name: train
num_bytes: 249884107.148
num_examples: 1628
download_size: 249501839
dataset_size: 249884107.148
---
# Dataset Card for "audio-class"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_232 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1198651708.0
num_examples: 235399
download_size: 1225301497
dataset_size: 1198651708.0
---
# Dataset Card for "chunk_232"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ProfessorBob/text-embedding-dataset | ---
dataset_info:
- config_name: Documents
features:
- name: doc_id
dtype: string
- name: doc
dtype: string
splits:
- name: history
num_bytes: 508218
num_examples: 224
- name: religion
num_bytes: 302837
num_examples: 126
- name: recherche
num_bytes: 235256
num_examples: 69
- name: python
num_bytes: 660763
num_examples: 194
download_size: 952235
dataset_size: 1707074
- config_name: MOOC_MCQ_Queries
features:
- name: query_id
dtype: string
- name: query
dtype: string
- name: answers
sequence: string
- name: distractions
sequence: string
- name: relevant_docs
sequence: string
splits:
- name: history
num_bytes: 13156
num_examples: 58
- name: religion
num_bytes: 52563
num_examples: 125
- name: recherche
num_bytes: 18791
num_examples: 52
- name: python
num_bytes: 29759
num_examples: 85
download_size: 80494
dataset_size: 114269
configs:
- config_name: Documents
data_files:
- split: history
path: Documents/history-*
- split: religion
path: Documents/religion-*
- split: recherche
path: Documents/recherche-*
- split: python
path: Documents/python-*
- config_name: MOOC_MCQ_Queries
data_files:
- split: history
path: MOOC_MCQ_Queries/history-*
- split: religion
path: MOOC_MCQ_Queries/religion-*
- split: recherche
path: MOOC_MCQ_Queries/recherche-*
- split: python
path: MOOC_MCQ_Queries/python-*
---
# Text embedding Datasets
The text embedding datasets consist of several paired (query, passage) datasets intended for fine-tuning text-embedding models. These datasets are well suited for developing and testing algorithms in natural language processing, information retrieval, and similar applications.
## Dataset Details
Each dataset in this collection is structured to facilitate the training and evaluation of text-embedding models. The datasets are diverse, covering multiple domains and formats. They are particularly useful for tasks like semantic search, question-answering systems, and document retrieval.
### [MOOC MCQ Queries]
The "MOOC MCQ Queries" dataset is derived from [FUN MOOC](https://www.fun-mooc.fr/fr/), an online platform offering a wide range of French courses across various domains. Its content is manually curated to help students better understand course materials, which makes it particularly high quality.
#### Content Overview:
- **Language**: French
- **Domains**:
- History: 57 examples
- Religion: 125 examples
- [Other domains to be added]
- **Dataset Description**:
Each record in the dataset includes the following fields:
```json
{
"query_id": "Unique identifier for each query",
"query": "Text of the multiple-choice question (MCQ)",
"answers": ["List of correct answer choices"],
"distractions": ["List of incorrect choices"],
"relevant_docs": ["List of relevant document IDs aiding the answer"]
}
```
- **Statistics**:
| Category | Num. of Queries | Query Avg. Words | Number of Docs | Short Docs (<375 words) | Long Docs (≥375 words) | Doc Avg. Words |
|----------------|-----------------|------------------|----------------|-------------------------|------------------------|----------------|
| history | 57 | 11.31 | 224 | 147 | 77 | 351.79 |
| religion | 125 | 15.08 | 126 | 78 | 48 | 375.63 |
| recherche | 52 | 12.71 | 69 | 20 | 49 | 535.00 |
| python | 85 | 21.24 | 194 | 27 | 167 | 552.60 |
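For fine-tuning, each query can be expanded into (query, passage) pairs via its `relevant_docs` field; a minimal sketch over hypothetical records (the document and query contents below are invented for illustration):

```python
# Hypothetical records following the Documents and MOOC_MCQ_Queries schemas
docs = {
    "d1": "Passage about the French Revolution.",
    "d2": "Passage about monastic orders.",
}
queries = [
    {"query_id": "q1", "query": "When did the French Revolution start?",
     "answers": ["1789"], "distractions": ["1815", "1848"],
     "relevant_docs": ["d1"]},
]

# Expand each query into (query, positive passage) training pairs
pairs = [(q["query"], docs[d]) for q in queries for d in q["relevant_docs"]]
print(pairs)
```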
### [Wikitext generated Queries]
To be completed.
### [Documents]
This dataset is an extensive collection of document chunks (or entire documents, for short texts), designed to complement the MOOC MCQ Queries and the other datasets in the collection.
- **Chunking strategies**:
  - MOOC MCQ Queries: documents are chunked along their natural divisions, such as sections or subsections, so that each chunk maintains contextual integrity.
- **Content format**:
```json
{
"doc_id": "Unique identifier for each document",
"doc": "Text content of the document"
}
``` |
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b | ---
pretty_name: Evaluation run of pankajmathur/orca_mini_v3_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/orca_mini_v3_7b](https://huggingface.co/pankajmathur/orca_mini_v3_7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-24T09:53:37.786344](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b/blob/main/results_2023-10-24T09-53-37.786344.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08043204697986577,\n\
\ \"em_stderr\": 0.0027851341980506704,\n \"f1\": 0.15059563758389252,\n\
\ \"f1_stderr\": 0.0030534563383277672,\n \"acc\": 0.4069827001752661,\n\
\ \"acc_stderr\": 0.009686225873410097\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08043204697986577,\n \"em_stderr\": 0.0027851341980506704,\n\
\ \"f1\": 0.15059563758389252,\n \"f1_stderr\": 0.0030534563383277672\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865706\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/orca_mini_v3_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|arc:challenge|25_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T09_53_37.786344
path:
- '**/details_harness|drop|3_2023-10-24T09-53-37.786344.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T09-53-37.786344.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T09_53_37.786344
path:
- '**/details_harness|gsm8k|5_2023-10-24T09-53-37.786344.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T09-53-37.786344.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hellaswag|10_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T09-56-47.532864.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T09-56-47.532864.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T09_53_37.786344
path:
- '**/details_harness|winogrande|5_2023-10-24T09-53-37.786344.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T09-53-37.786344.parquet'
- config_name: results
data_files:
- split: 2023_09_13T09_56_47.532864
path:
- results_2023-09-13T09-56-47.532864.parquet
- split: 2023_10_24T09_53_37.786344
path:
- results_2023-10-24T09-53-37.786344.parquet
- split: latest
path:
- results_2023-10-24T09-53-37.786344.parquet
---
# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_7b](https://huggingface.co/pankajmathur/orca_mini_v3_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T09:53:37.786344](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b/blob/main/results_2023-10-24T09-53-37.786344.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672,
"acc": 0.4069827001752661,
"acc_stderr": 0.009686225873410097
},
"harness|drop|3": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865706
}
}
```
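As a minimal sketch of working with these numbers (the dict below is an abridged copy of the JSON above, inlined by hand rather than loaded from the repo), the per-task accuracies can be pulled out like this:

```python
# Abridged copy of the aggregated results shown above (only the fields used here).
results = {
    "all": {"em": 0.08043204697986577, "f1": 0.15059563758389252, "acc": 0.4069827001752661},
    "harness|gsm8k|5": {"acc": 0.0712661106899166, "acc_stderr": 0.007086462127954491},
    "harness|winogrande|5": {"acc": 0.7426992896606156, "acc_stderr": 0.012285989618865706},
}

# Collect the accuracy of every per-task entry (every key except the "all" aggregate).
task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

best_task = max(task_acc, key=task_acc.get)
print(best_task)  # harness|winogrande|5
```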
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jan-hq/spider_sql_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1494601
num_examples: 7000
- name: test
num_bytes: 214813
num_examples: 1034
download_size: 405782
dataset_size: 1709414
---
# Dataset Card for "spider_sql_binarized"
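Per the YAML header, each record's `messages` field is a list of `{content, role}` turns. A minimal sketch of flattening one such conversation into plain text (the example conversation is illustrative, not an actual record from the dataset):

```python
def flatten_messages(messages: list[dict]) -> str:
    """Render a list of {"content", "role"} turns as a plain-text transcript."""
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

# Hypothetical conversation in the schema described by the card's YAML header.
example = [
    {"role": "user", "content": "How many singers do we have?"},
    {"role": "assistant", "content": "SELECT count(*) FROM singer"},
]
transcript = flatten_messages(example)
```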
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuan-sf63/word_mask_Nf_32 | ---
dataset_info:
features:
- name: feature
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 7934257.279690487
num_examples: 80487
- name: validation
num_bytes: 881682.7203095123
num_examples: 8944
download_size: 6602823
dataset_size: 8815940.0
---
# Dataset Card for "word_mask_Nf_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ancerlop/MistralAI | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kgr123/quality_counter_1000 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 5847385
num_examples: 1929
- name: train
num_bytes: 5806234
num_examples: 1935
- name: validation
num_bytes: 5882182
num_examples: 1941
download_size: 4214869
dataset_size: 17535801
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
tr416/dataset_20231007_034029 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73744
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231007_034029"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
swikrit/embedding | ---
license: mit
---
|
TrainingDataPro/spine-magnetic-resonance-imaging-dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-segmentation
- image-to-image
- object-detection
language:
- en
tags:
- medical
- biology
- code
---
# Spine MRI Dataset, Anomaly Detection & Segmentation
The dataset consists of .dcm files containing **MRI scans of the spine** of a person with several dystrophic changes, such as changes in the shape of the spine, osteophytes, disc protrusions, intracerebral lesions, hydromyelia, spondyloarthrosis and spondylosis, anatomical narrowness of the spinal canal, and asymmetry of the vertebral arteries. The images are **labeled** by doctors and accompanied by a **report** in PDF format.
The dataset includes 5 studies, captured from different angles, which provide a comprehensive view of several dystrophic changes and are useful for training spine anomaly classification algorithms. Each scan includes detailed imaging of the spine, including the *vertebrae, discs, nerves, and surrounding tissues*.
### MRI study angles in the dataset

# 💴 For Commercial Usage: Full version of the dataset includes 20,000 spine studies of people with different conditions, leave a request on **[TrainingData](https://trainingdata.pro/data-market/spine-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=spine-magnetic-resonance-imaging-dataset)** to buy the dataset
### Types of diseases and conditions in the full dataset:
- Degeneration of discs
- Osteophytes
- Osteochondrosis
- Hemangioma
- Disk extrusion
- Spondylitis
- **AND MANY OTHER CONDITIONS**

Researchers and healthcare professionals can use this dataset to study spinal conditions and disorders, such as herniated discs, spinal stenosis, scoliosis, and fractures. The dataset can also be used to develop and evaluate new imaging techniques, computer algorithms for image analysis, and artificial intelligence models for automated diagnosis.
# 💴 Buy the Dataset: This is just an example of the data. Leave a request on [https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/spine-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=spine-magnetic-resonance-imaging-dataset) to discuss your requirements, learn about the price and buy the dataset
# Content
### The dataset includes:
- **ST000001**: includes subfolders with 5 studies. Each study includes MRI-scans in **.dcm and .jpg formats**,
- **DICOMDIR**: includes information about the patient's condition and links to access files,
- **Spine_MRI_5.pdf**: includes medical report, provided by the radiologist,
- **.csv file**: includes id of the studies and the number of files
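Since the studies arrive as folders of .dcm files alongside a .csv listing study ids and file counts, a small inventory helper can sanity-check a local copy. This is a sketch only: the exact folder nesting under **ST000001** is an assumption based on the description above, so the traversal may need adjusting for the real layout.

```python
from pathlib import Path


def count_dicom_files(root: str) -> dict[str, int]:
    """Count .dcm files under each study subfolder of the given directory.

    The one-folder-per-study layout is an assumption based on the dataset
    description; adjust the traversal to match your local copy.
    """
    counts: dict[str, int] = {}
    for study_dir in sorted(Path(root).iterdir()):
        if study_dir.is_dir():
            counts[study_dir.name] = len(list(study_dir.rglob("*.dcm")))
    return counts
```

The resulting counts can then be compared against the id/file-count rows in the bundled .csv file.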
### Medical reports include the following data:
- Patient's **demographic information**,
- **Description** of the case,
- Preliminary **diagnosis**,
- **Recommendations** on further actions
*All patients consented to the publication of data*
# Medical data might be collected in accordance with your requirements.
## [TrainingData](https://trainingdata.pro/data-market/spine-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=spine-magnetic-resonance-imaging-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: mri spine scans, spinal imaging, radiology dataset, neuroimaging, medical imaging data, image segmentation, lumbar spine mri, thoracic spine mri, cervical spine mri, spine anatomy, spinal cord mri, orthopedic imaging, radiologist dataset, mri scan analysis, spine mri dataset, machine learning medical imaging, spinal abnormalities, image classification, neural network spine scans, mri data analysis, deep learning medical imaging, mri image processing, spine tumor detection, spine injury diagnosis, mri image segmentation, spine mri classification, artificial intelligence in radiology, spine abnormalities detection, spine pathology analysis, mri feature extraction.* |
Aman6917/autotrain-data-fine_tune_table_tm2 | ---
task_categories:
- summarization
---
# AutoTrain Dataset for project: fine_tune_table_tm2
## Dataset Description
This dataset has been automatically processed by AutoTrain for project fine_tune_table_tm2.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "List all PO headers with a valid vendor record in database",
"target": "select * from RETAILBUYER_POHEADER P inner join RETAILBUYER_VENDOR V\non P.VENDOR_ID = V.VENDOR_ID"
},
{
"text": "List all details of PO headers which have a vendor in vendor table",
"target": "select * from RETAILBUYER_POHEADER P inner join RETAILBUYER_VENDOR V\non P.VENDOR_ID = V.VENDOR_ID"
}
]
```
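As a sketch, the `target` SQL of the sample pair above can be run against stand-in tables with `sqlite3`; the table and join-column names come from the sample, but the remaining columns (`PO_ID`, `NAME`) and the rows are assumptions for illustration:

```python
import sqlite3

# Hypothetical sketch: execute the sample "target" SQL against an in-memory
# database. Only VENDOR_ID is taken from the sample; other columns are assumed.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE RETAILBUYER_POHEADER (PO_ID INTEGER, VENDOR_ID INTEGER)")
con.execute("CREATE TABLE RETAILBUYER_VENDOR (VENDOR_ID INTEGER, NAME TEXT)")
con.execute("INSERT INTO RETAILBUYER_POHEADER VALUES (1, 10), (2, 99)")
con.execute("INSERT INTO RETAILBUYER_VENDOR VALUES (10, 'Acme Supplies')")

target = ("select * from RETAILBUYER_POHEADER P inner join RETAILBUYER_VENDOR V\n"
          "on P.VENDOR_ID = V.VENDOR_ID")
rows = con.execute(target).fetchall()
print(rows)  # only the PO header with a matching vendor record survives the join
```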
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 32 |
| valid | 17 |
|
iamnguyen/fqa_v1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: vector
sequence: float64
- name: tokenized_question
dtype: string
- name: content
dtype: string
- name: school_id
dtype: string
splits:
- name: train
num_bytes: 2559239
num_examples: 178
download_size: 2035990
dataset_size: 2559239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ziozzang/multi-lang-translation-set | ---
license: mit
language:
- ar
- ko
- en
- ja
- id
- de
- pt
- es
- ru
- fr
- it
---
This is a test dataset for multi-language translation.
- Generated by machine translation.
License
- MIT. |
jlbaker361/korra-lite_captioned-augmented | ---
dataset_info:
features:
- name: image
dtype: image
- name: src
dtype: string
- name: split
dtype: string
- name: id
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 254281033.375
num_examples: 1173
download_size: 254182437
dataset_size: 254281033.375
---
# Dataset Card for "korra-lite_captioned-augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_what_comparative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 989
num_examples: 3
- name: train
num_bytes: 1398
num_examples: 4
download_size: 9495
dataset_size: 2387
---
# Dataset Card for "MULTI_VALUE_mrpc_what_comparative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ArteChile/footos | ---
license: artistic-2.0
---
|
sreejith8100/death_marriage_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': death
'1': marriage
splits:
- name: train
num_bytes: 579589900.0
num_examples: 448
- name: test
num_bytes: 13589304.0
num_examples: 20
download_size: 593212683
dataset_size: 593179204.0
---
# Dataset Card for "death_marriage_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/germanquad_qg_dataset | ---
license: cc-by-4.0
task_categories:
- text2text-generation
language:
- de
size_categories:
- 1K<n<10K
--- |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-80000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 970107
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TheAIchemist13/malyalam_asr_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: ' transcriptions'
dtype: string
splits:
- name: train
num_bytes: 1437332887.196
num_examples: 3023
- name: test
num_bytes: 576755142.814
num_examples: 1103
download_size: 1668143452
dataset_size: 2014088030.0100002
---
# Dataset Card for "malyalam_asr_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tanvir1337/Allopathic_Drug_Manufacturers-BD | ---
license: odc-by
pretty_name: Allopathic Drug Manufacturers Bangladesh
tags:
- Allopathic
- Drugs
- Manufacturer
language:
- en
size_categories:
- n<1K
---
# Allopathic_Drug_Manufacturers-BD [JSON dataset]
A dataset containing information about allopathic drug manufacturers in Bangladesh. The data is provided by the Directorate General of Drug Administration. The source, in PDF format, is available on the [MedEx](https://medex.com.bd) website.
## Dataset Contents
The dataset includes the following information for each allopathic drug manufacturer:
- Serial
- Pharmaceutical Name
- Location
- Manufacturing Licence Number (Biological)
- Manufacturing Licence Number (Non-Biological)
- Status
Please note that the dataset may not include all the manufacturers in Bangladesh.
## Note
The original PDF source list contains inaccuracies and incomplete entries, which have been identified and rectified to the best extent possible in the updated JSON list.
## Data Source
- [Allopathic Drug Manufacturers - MedEx](https://medex.com.bd/downloads/KUE7l4PXoVUffEibhCCfyeU6Tn7Key6HtRVE6/allopathic-drug-manufacturers.pdf)
## Disclaimer
Please note that the dataset's accuracy cannot be guaranteed.
|
Herreera1/Instructions_objects | ---
task_categories:
- text-classification
language:
- es
pretty_name: Dataset tesis
size_categories:
- n<1K
--- |
Ahmadsameh8/qalbPreprocessedAndMerged | ---
dataset_info:
features:
- name: correct
dtype: string
- name: incorrect
dtype: string
splits:
- name: train
num_bytes: 19474720
num_examples: 18350
- name: validation
num_bytes: 2355237
num_examples: 2293
- name: test
num_bytes: 2843810
num_examples: 2295
download_size: 12819027
dataset_size: 24673767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
jxm/scifact__openai_ada2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings_A
sequence: float64
splits:
- name: train
num_bytes: 66934073
num_examples: 5183
download_size: 67028968
dataset_size: 66934073
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hugfaceguy0001/ChatGPTGroundTruth | ---
license: openrail
task_categories:
- question-answering
language:
- en
tags:
- science
pretty_name: ChatGPT ground truth
size_categories:
- 10K<n<100K
configs:
- config_name: main_data
data_files: "ground_truth.jsonl"
---
# ChatGPT ground truth dataset
This dataset is generated by ChatGPT and contains factual questions and corresponding answers from 160 subfields across natural and social sciences.
Specifically, the dataset covers eight major domains: mathematics, physics, chemistry, biology, medicine, engineering, computer science, and social sciences. Within each domain, 20 specific subfields are selected, with 500 question-answer pairs per subfield, resulting in a total of 80,000 question-answer pairs.
The language used in this dataset is English.
Accompanying the release of this dataset is the script code used to generate it.
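As a minimal sketch, the question-answer pairs can be read in the JSON Lines layout implied by the `ground_truth.jsonl` data file named in the config above; the field names `question` and `answer` are assumptions for illustration:

```python
import io
import json

# Hypothetical sketch: parse JSON Lines question-answer pairs.
# The field names "question" and "answer" are assumptions for illustration.
sample = io.StringIO(
    '{"question": "What is the chemical symbol for gold?", "answer": "Au"}\n'
)
pairs = [json.loads(line) for line in sample]
print(pairs[0]["answer"])  # Au
```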
# ChatGPT ground truth dataset
This dataset is automatically generated by ChatGPT and contains factual questions and corresponding answers from 160 subfields of the natural and social sciences.
Specifically, the dataset covers eight major domains: mathematics, physics, chemistry, biology, medicine, engineering, computer science, and social sciences. Within each domain, 20 subfields were selected, with 500 question-answer pairs per subfield, for a total of 80,000 question-answer pairs.
The language of this dataset is English.
The script code used to generate this dataset is released alongside it. |
abhishek/autonlp-data-prodigy-10 | ---
language:
- en
---
# AutoNLP Dataset for project: prodigy-10
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project prodigy-10.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tags": [
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8
],
"tokens": [
"tory",
"backing",
"for",
"i",
"d",
"cards",
"the",
"tories",
"are",
"to",
"back",
"controversial",
"government",
"plans",
"to",
"introduce",
"i",
"d",
"cards",
".",
" ",
"the",
"shadow",
"cabinet",
"revealed",
"its",
"support",
"ahead",
"of",
"next",
"week",
"s",
"commons",
"vote",
"on",
"a",
"bill",
"to",
"introduce",
"compulsory",
"i",
"d.",
"the",
"decision",
"follows",
"a",
" ",
"tough",
"meeting",
" ",
"where",
"some",
"senior",
"tories",
"argued",
"vociferously",
"against",
"the",
"move",
" ",
"party",
"sources",
"told",
"the",
"bbc",
".",
"the",
"bill",
" ",
"which",
"ministers",
"claim",
"will",
"tackle",
"crime",
" ",
"terrorism",
"and",
"illegal",
"immigration",
" ",
"is",
"expected",
"to",
"be",
"opposed",
"by",
"the",
"liberal",
"democrats",
".",
" ",
"they",
"have",
"said",
"the",
"scheme",
"is",
" ",
"deeply",
"flawed",
" ",
"and",
"a",
"waste",
"of",
"money",
".",
"sources",
"within",
"the",
"conservative",
"party",
"told",
"the",
"bbc",
"michael",
"howard",
"has",
"always",
"been",
"in",
"favour",
"of",
"i",
"d",
"cards",
" ",
"and",
"tried",
"to",
"introduce",
"them",
"when",
"he",
"was",
"home",
"secretary",
".",
"the",
"party",
"has",
"been",
" ",
"agnostic",
" ",
"on",
"the",
"issue",
"until",
"now",
"but",
"had",
"now",
"decided",
"to",
"come",
"off",
"the",
"fence",
" ",
"the",
"tory",
"source",
"said",
".",
"despite",
"giving",
"their",
"backing",
"to",
"i",
"d",
"cards",
" ",
"the",
"conservatives",
"insisted",
"they",
"would",
"hold",
"ministers",
"to",
"account",
"over",
"the",
"precise",
"purpose",
"of",
"the",
"scheme",
".",
" ",
"they",
"said",
"they",
"would",
"also",
"press",
"labour",
"over",
"whether",
"objectives",
"could",
"be",
"met",
"and",
"whether",
"the",
"home",
"office",
"would",
"deliver",
"them",
".",
"and",
"they",
"pledged",
"to",
"assess",
"the",
"cost",
"effectiveness",
"of",
"i",
"d",
"cards",
"and",
"whether",
"people",
"s",
"privacy",
"would",
"be",
"properly",
"protected",
".",
" ",
"it",
"is",
"important",
"to",
"remember",
"that",
"this",
"bill",
"will",
"take",
"a",
"decade",
"to",
"come",
"into",
"full",
"effect",
" ",
"a",
"spokesman",
"said",
".",
" ",
"it",
"will",
"do",
"nothing",
"to",
"solve",
"the",
"immediate",
"problems",
"of",
"rising",
"crime",
"and",
"uncontrolled",
"immigration",
".",
" ",
"lib",
"dem",
"home",
"affairs",
"spokesman",
"mark",
"oaten",
"said",
":",
" ",
"this",
"has",
"all",
"the",
"signs",
"of",
"michael",
"howard",
"overruling",
"colleagues",
" ",
"concerns",
"over",
"i",
"d",
"cards",
".",
" ",
"the",
"tories",
"should",
"have",
"the",
"courage",
"to",
"try",
"and",
"change",
"public",
"opinion",
"not",
"follow",
"it",
".",
" ",
"the",
"new",
"chairman",
"of",
"the",
"bar",
"council",
" ",
"guy",
"mansfield",
"qc",
"warned",
"there",
"was",
"a",
"real",
"risk",
"that",
"people",
"on",
"the",
" ",
"margins",
"of",
"society",
" ",
"would",
"be",
"driven",
"into",
"the",
"hands",
"of",
"extremists",
".",
" ",
"what",
"is",
"going",
"to",
"happen",
"to",
"young",
"asian",
"men",
"when",
"there",
"has",
"been",
"a",
"bomb",
"gone",
"off",
"somewhere",
" ",
"they",
"are",
"going",
"to",
"be",
"stopped",
".",
"if",
"they",
"haven",
"t",
"[",
"i",
"d",
"cards",
"]",
"they",
"are",
"going",
"to",
"be",
"detained",
"."
]
},
{
"tags": [
2,
6,
8,
8,
0,
8,
0,
8,
8,
8,
2,
6,
6,
8,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
1,
5,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
8,
8,
0,
8,
2,
6,
8,
2,
6,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
2,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
2,
6,
6,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
2,
6,
6,
8,
8,
8,
0,
8,
2,
6,
8,
8,
8,
8,
2,
6,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
2,
8,
8,
8,
8,
8,
2,
6,
8,
8,
8,
8,
8,
8,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
2,
6,
8,
8,
2,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
2,
6,
8,
8,
0,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
8,
2,
6,
8,
8,
8,
8,
8,
8,
8,
2,
6,
8
],
"tokens": [
"o",
"gara",
"revels",
"in",
"ireland",
"victory",
"ireland",
"fly",
"-",
"half",
"ronan",
"o",
"gara",
"hailed",
"his",
"side",
"s",
"19",
"-",
"13",
"victory",
"over",
"england",
"as",
"a",
" ",
"special",
" ",
"win",
".",
" ",
"the",
"munster",
"number",
"10",
"kicked",
"a",
"total",
"of",
"14",
"points",
" ",
"including",
"two",
"drop",
"goals",
" ",
"to",
"help",
"keep",
"alive",
"their",
"grand",
"slam",
"hopes",
".",
"he",
"told",
"bbc",
"sport",
":",
" ",
"we",
"made",
"hard",
"work",
"of",
"it",
"but",
"it",
"s",
"still",
"special",
"to",
"beat",
"england",
".",
" ",
"i",
"had",
"three",
"chances",
"to",
"win",
"the",
"game",
"but",
"didn",
"t.",
"we",
"have",
"work",
"to",
"do",
"after",
"this",
"but",
"we",
"never",
"take",
"a",
"victory",
"over",
"england",
"lightly",
".",
" ",
"ireland",
"hooker",
"shane",
"byrne",
"echoed",
"o",
"gara",
"s",
"comments",
"but",
"admitted",
"the",
"game",
"had",
"been",
"england",
"s",
"best",
"outing",
"in",
"the",
"six",
"nations",
".",
"byrne",
"said",
":",
" ",
"it",
"was",
"a",
"really",
" ",
"really",
"hard",
"game",
"but",
"from",
"one",
"to",
"15",
"in",
"our",
"team",
"we",
"worked",
"really",
" ",
"really",
"hard",
".",
" ",
"we",
"just",
"had",
"to",
"stick",
"to",
"our",
"defensive",
"pattern",
" ",
"trust",
"ourselves",
"and",
"trust",
"those",
"around",
"us",
".",
"all",
"round",
"it",
"was",
"fantastic",
".",
" ",
"ireland",
"captain",
"brian",
"o",
"driscoll",
" ",
"who",
"scored",
"his",
"side",
"s",
"only",
"try",
" ",
"said",
":",
" ",
"we",
"are",
"delighted",
" ",
"we",
"felt",
"if",
"we",
"performed",
"well",
"then",
"we",
"would",
"win",
"but",
"with",
"england",
"also",
"having",
"played",
"very",
"well",
"it",
"makes",
"it",
"all",
"the",
"sweeter",
".",
" ",
"we",
"did",
"get",
"the",
"bounce",
"of",
"the",
"ball",
"and",
"some",
"days",
"that",
"happens",
"and",
"you",
"ve",
"just",
"got",
"to",
"jump",
"on",
"the",
"back",
"of",
"it",
".",
" ",
"ireland",
"coach",
"eddie",
"o",
"sullivan",
"was",
"surprised",
"that",
"england",
"coach",
"andy",
"robinson",
"said",
"he",
"was",
"certain",
"mark",
"cueto",
"was",
"onside",
"for",
"a",
"disallowed",
"try",
"just",
"before",
"the",
"break",
".",
" ",
"andy",
"was",
"sitting",
"two",
"yards",
"from",
"me",
"and",
"i",
"couldn",
"t",
"see",
"whether",
"he",
"was",
"offside",
"or",
"not",
"so",
"i",
"don",
"t",
"know",
"how",
"andy",
"could",
"have",
"known",
" ",
"said",
"o",
"sullivan",
".",
" ",
"what",
"i",
"do",
"know",
"is",
"that",
"england",
"played",
"well",
"and",
"when",
"that",
"happens",
"it",
"makes",
"a",
"very",
"good",
"victory",
"for",
"us",
".",
" ",
"we",
"had",
"to",
"defend",
"for",
"long",
"periods",
"and",
"that",
"is",
"all",
"good",
"for",
"the",
"confidence",
"of",
"the",
"team",
".",
" ",
"i",
"think",
"our",
"try",
"was",
"very",
"well",
"worked",
" ",
"it",
"was",
"a",
"gem",
" ",
"as",
"good",
"a",
"try",
"as",
"we",
"have",
"scored",
"for",
"a",
"while",
".",
" ",
"o",
"sullivan",
"also",
"rejected",
"robinson",
"s",
"contention",
"england",
"dominated",
"the",
"forward",
"play",
".",
" ",
"i",
"think",
"we",
"lost",
"one",
"lineout",
"and",
"they",
"lost",
"four",
"or",
"five",
"so",
"i",
"don",
"t",
"know",
"how",
"that",
"adds",
"up",
"to",
"domination",
" ",
"he",
"said",
".",
"o",
"driscoll",
"also",
"insisted",
"ireland",
"were",
"happy",
"to",
"handle",
"the",
"pressure",
"of",
"being",
"considered",
"favourites",
"to",
"win",
"the",
"six",
"nations",
"title",
".",
" ",
"this",
"season",
"for",
"the",
"first",
"time",
"we",
"have",
"been",
"able",
"to",
"play",
"with",
"the",
"favourites",
" ",
"tag",
" ",
"he",
"said",
".",
" ",
"hopefully",
"we",
"have",
"proved",
"that",
"today",
"and",
"can",
"continue",
"to",
"keep",
"doing",
"so",
".",
" ",
"as",
"for",
"my",
"try",
"it",
"was",
"a",
"move",
"we",
"had",
"worked",
"on",
"all",
"week",
".",
"there",
"was",
"a",
"bit",
"of",
"magic",
"from",
"geordan",
"murphy",
"and",
"it",
"was",
"a",
"great",
"break",
"from",
"denis",
"hickie",
"."
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tags": "Sequence(feature=ClassLabel(num_classes=9, names=['B-LOCATION', 'B-ORG', 'B-PERSON', 'B-PRODUCT', 'I-LOCATION', 'I-ORG', 'I-PERSON', 'I-PRODUCT', 'O'], names_file=None, id=None), length=-1, id=None)",
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)"
}
```
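As a sketch, the integer `tags` can be mapped back to label names using the `ClassLabel` name list shown above:

```python
# Map integer NER tags back to label names, using the ClassLabel names
# listed in the "tags" feature above.
names = ['B-LOCATION', 'B-ORG', 'B-PERSON', 'B-PRODUCT',
         'I-LOCATION', 'I-ORG', 'I-PERSON', 'I-PRODUCT', 'O']

# First three tokens and tags of the second sample above.
tokens = ["o", "gara", "revels"]
tags = [2, 6, 8]

labeled = [(tok, names[tag]) for tok, tag in zip(tokens, tags)]
print(labeled)  # [('o', 'B-PERSON'), ('gara', 'I-PERSON'), ('revels', 'O')]
```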
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 186 |
| valid | 58 |
|
Plim/fr_wikipedia_processed | ---
language: fr
--- |
open-llm-leaderboard/details_FelixChao__WestSeverus-7B | ---
pretty_name: Evaluation run of FelixChao/WestSeverus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/WestSeverus-7B](https://huggingface.co/FelixChao/WestSeverus-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__WestSeverus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T19:22:25.725845](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-7B/blob/main/results_2024-01-23T19-22-25.725845.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546959623905765,\n\
\ \"acc_stderr\": 0.032069118843639784,\n \"acc_norm\": 0.6545111906904139,\n\
\ \"acc_norm_stderr\": 0.032737272576266796,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6289063808843572,\n\
\ \"mc2_stderr\": 0.015244465231660157\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.674061433447099,\n \"acc_stderr\": 0.013697432466693247,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6905994821748656,\n\
\ \"acc_stderr\": 0.0046130181011853014,\n \"acc_norm\": 0.8746265684126668,\n\
\ \"acc_norm_stderr\": 0.0033046510372765534\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291936,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291936\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"\
acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\
\ \"acc_stderr\": 0.016482782187500673,\n \"acc_norm\": 0.41564245810055866,\n\
\ \"acc_norm_stderr\": 0.016482782187500673\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6289063808843572,\n\
\ \"mc2_stderr\": 0.015244465231660157\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.01265254413318614\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/WestSeverus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|arc:challenge|25_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|gsm8k|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hellaswag|10_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T19-22-25.725845.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- '**/details_harness|winogrande|5_2024-01-23T19-22-25.725845.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T19-22-25.725845.parquet'
- config_name: results
data_files:
- split: 2024_01_23T19_22_25.725845
path:
- results_2024-01-23T19-22-25.725845.parquet
- split: latest
path:
- results_2024-01-23T19-22-25.725845.parquet
---
# Dataset Card for Evaluation run of FelixChao/WestSeverus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/WestSeverus-7B](https://huggingface.co/FelixChao/WestSeverus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__WestSeverus-7B",
"harness_winogrande_5",
split="train")
```
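Each config name in this card is derived mechanically from the harness task key that appears in the results JSON: the `|`, `:`, and `-` separators become `_` (compare, e.g., `harness|truthfulqa:mc|0` with the `harness_truthfulqa_mc_0` config above). A small helper sketching that mapping — the rule is inferred from the names in this card, not an official API:

```python
def task_key_to_config(task_key: str) -> str:
    """Map a harness task key (e.g. 'harness|truthfulqa:mc|0') to the
    corresponding dataset config name (e.g. 'harness_truthfulqa_mc_0')."""
    return task_key.replace("|", "_").replace(":", "_").replace("-", "_")

# Examples matching the configs listed in this card:
print(task_key_to_config("harness|winogrande|5"))              # harness_winogrande_5
print(task_key_to_config("harness|hendrycksTest-virology|5"))  # harness_hendrycksTest_virology_5
```

This makes it easy to go from a key in the results dictionary below to the config argument of `load_dataset`.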
## Latest results
These are the [latest results from run 2024-01-23T19:22:25.725845](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-7B/blob/main/results_2024-01-23T19-22-25.725845.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546959623905765,
"acc_stderr": 0.032069118843639784,
"acc_norm": 0.6545111906904139,
"acc_norm_stderr": 0.032737272576266796,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6289063808843572,
"mc2_stderr": 0.015244465231660157
},
"harness|arc:challenge|25": {
"acc": 0.674061433447099,
"acc_stderr": 0.013697432466693247,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725225
},
"harness|hellaswag|10": {
"acc": 0.6905994821748656,
"acc_stderr": 0.0046130181011853014,
"acc_norm": 0.8746265684126668,
"acc_norm_stderr": 0.0033046510372765534
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291936,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500673,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500673
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031215,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6289063808843572,
"mc2_stderr": 0.015244465231660157
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.01265254413318614
}
}
```
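The per-subject `hendrycksTest` entries above can be aggregated into a single MMLU-style number by averaging their accuracies. A minimal sketch, assuming an unweighted mean over subjects (the leaderboard's exact aggregation may differ):

```python
def mmlu_average(results: dict) -> float:
    """Unweighted mean accuracy over all hendrycksTest (MMLU) subjects."""
    accs = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

# Toy example using two subject scores from the results above:
sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8245614035087719},
    "harness|winogrande|5": {"acc": 0.8358326756116812},  # ignored: not an MMLU subject
}
print(round(mmlu_average(sample), 4))  # 0.6864
```

The same filtering-by-prefix pattern works on the full dictionary loaded from the results parquet/JSON files.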
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
disham993/Synthetic_Furniture_Dataset | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 190892
num_examples: 1003
download_size: 0
dataset_size: 190892
---
# Dataset Card for "Synthetic_Furniture_Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/compas | ---
language:
- en
tags:
- compas
- tabular_classification
- binary_classification
- UCI
pretty_name: Compas
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- encoding
- two-years-recidividity
- two-years-recidividity-no-race
- priors-prediction
- priors-prediction-no-race
- race
license: cc
---
# Compas
The [Compas dataset](https://github.com/propublica/compas-analysis) for recidivism prediction.
The dataset is known to have racial bias issues; see this [ProPublica article](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing) on the topic.
# Configurations and tasks
| **Configuration** | **Task** | Description |
|----------------------------------|---------------------------|-----------------------------------------------------------------|
| encoding | | Encoding dictionary showing original values of encoded features.|
| two-years-recidividity | Binary classification | Will the defendant be a violent recidivist? |
| two-years-recidividity-no-race | Binary classification | As above, but the `race` feature is removed. |
| priors-prediction | Regression | How many prior crimes has the defendant committed? |
| priors-prediction-no-race        | Regression                | As above, but the `race` feature is removed.                    |
| race | Multiclass classification | What is the `race` of the defendant? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/compas", "two-years-recidividity")["train"]
```
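The `-no-race` configurations are simply the corresponding task with the `race` feature dropped. The same transformation can be sketched in plain Python on toy rows (hypothetical values, not real Compas records):

```python
def drop_feature(rows, feature="race"):
    """Return rows with the given feature removed,
    mirroring the *-no-race configurations."""
    return [{k: v for k, v in row.items() if k != feature} for row in rows]

rows = [
    {"sex": 0, "age": 25, "race": 2, "number_of_prior_offenses": 1},
    {"sex": 1, "age": 34, "race": 0, "number_of_prior_offenses": 3},
]
print(drop_feature(rows))
```

Using the prebuilt `-no-race` configs is preferable in practice, since they guarantee the feature never enters the pipeline.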
# Features
|**Feature** |**Type** |**Description** |
|---------------------------------------|-----------|---------------------------------------|
|`sex` |`int64` | |
|`age` |`int64` | |
|`race` |`int64` | |
|`number_of_juvenile_fellonies` |`int64` | |
|`decile_score` |`int64` |Criminality score |
|`number_of_juvenile_misdemeanors` |`int64` | |
|`number_of_other_juvenile_offenses` |`int64` | |
|`number_of_prior_offenses` |`int64` | |
|`days_before_screening_arrest` |`int64` | |
|`is_recidivous` |`int64` | |
|`days_in_custody` |`int64` |Days spent in custody |
|`is_violent_recidivous` |`int64` | |
|`violence_decile_score` |`int64` |Criminality score for violent crimes |
|`two_years_recidivous` |`int64` | | |
autoevaluate/autoeval-eval-phpthinh__ex3-all-630c04-1799362235 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/ex3
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-560m
metrics: []
dataset_name: phpthinh/ex3
dataset_config: all
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-560m
* Dataset: phpthinh/ex3
* Config: all
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
open-llm-leaderboard/details_sail__Sailor-1.8B-Chat | ---
pretty_name: Evaluation run of sail/Sailor-1.8B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sail/Sailor-1.8B-Chat](https://huggingface.co/sail/Sailor-1.8B-Chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sail__Sailor-1.8B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T06:45:47.530917](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-1.8B-Chat/blob/main/results_2024-03-11T06-45-47.530917.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3806365170008389,\n\
\ \"acc_stderr\": 0.034063765218371976,\n \"acc_norm\": 0.3858613655901287,\n\
\ \"acc_norm_stderr\": 0.03490323085374032,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570349,\n \"mc2\": 0.38711002085151214,\n\
\ \"mc2_stderr\": 0.014178729804966983\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3242320819112628,\n \"acc_stderr\": 0.01367881039951882,\n\
\ \"acc_norm\": 0.3575085324232082,\n \"acc_norm_stderr\": 0.014005494275916571\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43069109739095796,\n\
\ \"acc_stderr\": 0.004941609820763587,\n \"acc_norm\": 0.5712009559848635,\n\
\ \"acc_norm_stderr\": 0.004938930143234449\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.37358490566037733,\n \"acc_stderr\": 0.029773082713319875,\n \
\ \"acc_norm\": 0.37358490566037733,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3194444444444444,\n\
\ \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.3194444444444444,\n\
\ \"acc_norm_stderr\": 0.03899073687357335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.03656343653353157,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.03656343653353157\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338005,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338005\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.02786932057166464,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.02786932057166464\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.03898531605579418,\n\
\ \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.03898531605579418\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5025906735751295,\n \"acc_stderr\": 0.03608390745384487,\n\
\ \"acc_norm\": 0.5025906735751295,\n \"acc_norm_stderr\": 0.03608390745384487\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4025641025641026,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.4025641025641026,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.44220183486238535,\n \"acc_stderr\": 0.021293613207520212,\n \"\
acc_norm\": 0.44220183486238535,\n \"acc_norm_stderr\": 0.021293613207520212\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.0305467452649532,\n \"acc_norm\"\
: 0.2777777777777778,\n \"acc_norm_stderr\": 0.0305467452649532\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.45098039215686275,\n\
\ \"acc_stderr\": 0.03492406104163613,\n \"acc_norm\": 0.45098039215686275,\n\
\ \"acc_norm_stderr\": 0.03492406104163613\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.510548523206751,\n \"acc_stderr\": 0.032539983791662855,\n\
\ \"acc_norm\": 0.510548523206751,\n \"acc_norm_stderr\": 0.032539983791662855\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.03314190222110658,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.03314190222110658\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.045629515481807666,\n \"\
acc_norm\": 0.512396694214876,\n \"acc_norm_stderr\": 0.045629515481807666\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5641025641025641,\n\
\ \"acc_stderr\": 0.032485775115784016,\n \"acc_norm\": 0.5641025641025641,\n\
\ \"acc_norm_stderr\": 0.032485775115784016\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4840357598978289,\n\
\ \"acc_stderr\": 0.017870847506081734,\n \"acc_norm\": 0.4840357598978289,\n\
\ \"acc_norm_stderr\": 0.017870847506081734\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.026636539741116086,\n\
\ \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.026636539741116086\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.01424263007057491,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.01424263007057491\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.02833239748366427,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.02833239748366427\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n\
\ \"acc_stderr\": 0.02801365189199507,\n \"acc_norm\": 0.4180064308681672,\n\
\ \"acc_norm_stderr\": 0.02801365189199507\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.027339546640662724,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.027339546640662724\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.02699219917306436,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.02699219917306436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3044328552803129,\n\
\ \"acc_stderr\": 0.011752877592597574,\n \"acc_norm\": 0.3044328552803129,\n\
\ \"acc_norm_stderr\": 0.011752877592597574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3627450980392157,\n \"acc_stderr\": 0.01945076843250551,\n \
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.01945076843250551\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4093567251461988,\n \"acc_stderr\": 0.03771283107626544,\n\
\ \"acc_norm\": 0.4093567251461988,\n \"acc_norm_stderr\": 0.03771283107626544\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570349,\n \"mc2\": 0.38711002085151214,\n\
\ \"mc2_stderr\": 0.014178729804966983\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5911602209944752,\n \"acc_stderr\": 0.01381695429513568\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \
\ \"acc_stderr\": 0.0051061078537441885\n }\n}\n```"
repo_url: https://huggingface.co/sail/Sailor-1.8B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|arc:challenge|25_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|gsm8k|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hellaswag|10_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-45-47.530917.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T06-45-47.530917.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- '**/details_harness|winogrande|5_2024-03-11T06-45-47.530917.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T06-45-47.530917.parquet'
- config_name: results
data_files:
- split: 2024_03_11T06_45_47.530917
path:
- results_2024-03-11T06-45-47.530917.parquet
- split: latest
path:
- results_2024-03-11T06-45-47.530917.parquet
---
# Dataset Card for Evaluation run of sail/Sailor-1.8B-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sail/Sailor-1.8B-Chat](https://huggingface.co/sail/Sailor-1.8B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sail__Sailor-1.8B-Chat",
"harness_winogrande_5",
split="train")
```
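The config names passed to `load_dataset` follow a regular pattern; as an illustrative sketch (the helper below is not part of the repo, it just mirrors the naming in the config listing above):

```python
# Config names in this repo follow the pattern "harness_<task>_<n_fewshot>"
# (see the config listing above).
def config_name(task: str, n_fewshot: int) -> str:
    return f"harness_{task}_{n_fewshot}"

# For example, the five-shot world-religions MMLU subset:
name = config_name("hendrycksTest_world_religions", 5)
# In practice you would then load it with:
# data = load_dataset("open-llm-leaderboard/details_sail__Sailor-1.8B-Chat",
#                     name, split="latest")
```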
## Latest results
These are the [latest results from run 2024-03-11T06:45:47.530917](https://huggingface.co/datasets/open-llm-leaderboard/details_sail__Sailor-1.8B-Chat/blob/main/results_2024-03-11T06-45-47.530917.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3806365170008389,
"acc_stderr": 0.034063765218371976,
"acc_norm": 0.3858613655901287,
"acc_norm_stderr": 0.03490323085374032,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570349,
"mc2": 0.38711002085151214,
"mc2_stderr": 0.014178729804966983
},
"harness|arc:challenge|25": {
"acc": 0.3242320819112628,
"acc_stderr": 0.01367881039951882,
"acc_norm": 0.3575085324232082,
"acc_norm_stderr": 0.014005494275916571
},
"harness|hellaswag|10": {
"acc": 0.43069109739095796,
"acc_stderr": 0.004941609820763587,
"acc_norm": 0.5712009559848635,
"acc_norm_stderr": 0.004938930143234449
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37358490566037733,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.37358490566037733,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.03656343653353157,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.03656343653353157
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338005,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338005
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4,
"acc_stderr": 0.02786932057166464,
"acc_norm": 0.4,
"acc_norm_stderr": 0.02786932057166464
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.03898531605579418,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.03898531605579418
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5025906735751295,
"acc_stderr": 0.03608390745384487,
"acc_norm": 0.5025906735751295,
"acc_norm_stderr": 0.03608390745384487
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4025641025641026,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.4025641025641026,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766118,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.44220183486238535,
"acc_stderr": 0.021293613207520212,
"acc_norm": 0.44220183486238535,
"acc_norm_stderr": 0.021293613207520212
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.0305467452649532,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.0305467452649532
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.510548523206751,
"acc_stderr": 0.032539983791662855,
"acc_norm": 0.510548523206751,
"acc_norm_stderr": 0.032539983791662855
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.03314190222110658,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.03314190222110658
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.045629515481807666,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.045629515481807666
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.032485775115784016,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.032485775115784016
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4840357598978289,
"acc_stderr": 0.017870847506081734,
"acc_norm": 0.4840357598978289,
"acc_norm_stderr": 0.017870847506081734
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.026636539741116086,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.026636539741116086
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.01424263007057491,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.01424263007057491
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.02833239748366427,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.02833239748366427
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.02801365189199507,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.02801365189199507
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.027339546640662724,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.027339546640662724
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.02699219917306436,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.02699219917306436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3044328552803129,
"acc_stderr": 0.011752877592597574,
"acc_norm": 0.3044328552803129,
"acc_norm_stderr": 0.011752877592597574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.01945076843250551,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.01945076843250551
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4093567251461988,
"acc_stderr": 0.03771283107626544,
"acc_norm": 0.4093567251461988,
"acc_norm_stderr": 0.03771283107626544
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570349,
"mc2": 0.38711002085151214,
"mc2_stderr": 0.014178729804966983
},
"harness|winogrande|5": {
"acc": 0.5911602209944752,
"acc_stderr": 0.01381695429513568
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.0051061078537441885
}
}
```
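As a sketch of how these per-task numbers can be post-processed (using a few of the values copied from the results above; the 0.5 threshold is an arbitrary choice for illustration):

```python
# A few per-task accuracies copied from the results above.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.5631067961165048},
    "harness|hendrycksTest-marketing|5": {"acc": 0.5641025641025641},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.20588235294117646},
    "harness|winogrande|5": {"acc": 0.5911602209944752},
}

# Tasks where the model clears an (arbitrary) 0.5 accuracy bar.
strong = sorted(t for t, m in results.items() if m["acc"] > 0.5)
```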
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AAAIMX/universities_in_merida | ---
license: mit
language:
- es
tags:
- university
- school
- aaaimx
size_categories:
- n<1K
--- |
jonathan-roberts1/RSD46-WHU | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': airport
'2': artificial dense forest land
'3': artificial sparse forest land
'4': bare land
'5': basketball court
'6': blue structured factory building
'7': building
'8': construction site
'9': cross river bridge
'10': crossroads
'11': dense tall building
'12': dock
'13': fish pond
'14': footbridge
'15': graff
'16': grassland
'17': irregular farmland
'18': low scattered building
'19': medium density scattered building
'20': medium density structured building
'21': natural dense forest land
'22': natural sparse forest land
'23': oil tank
'24': overpass
'25': parking lot
'26': plastic greenhouse
'27': playground
'28': railway
'29': red structured factory building
'30': refinery
'31': regular farmland
'32': scattered blue roof factory building
'33': scattered red roof factory building
'34': sewage plant-type-one
'35': sewage plant-type-two
'36': ship
'37': solar power station
'38': sparse residential area
'39': square
'40': steelworks
'41': storage land
'42': tennis court
'43': thermal power plant
'44': vegetable plot
'45': water
splits:
- name: train
num_bytes: 1650045051.96
num_examples: 17516
download_size: 2184490825
dataset_size: 1650045051.96
license: other
---
# Dataset Card for "RSD46-WHU"
## Dataset Description
- **Paper** [Accurate Object Localization in Remote Sensing Images Based on Convolutional Neural Networks](https://ieeexplore.ieee.org/iel7/36/7880748/07827088.pdf)
- **Paper** [High-Resolution Remote Sensing Image Retrieval Based on CNNs from a Dimensional Perspective](https://www.mdpi.com/209338)
- **Split** Validation
## Split Information
This HuggingFace dataset repository contains just the Validation split.
### Licensing Information
[Free for education, research and commercial use.](https://github.com/RSIA-LIESMARS-WHU/RSD46-WHU)
## Citation Information
[Accurate Object Localization in Remote Sensing Images Based on Convolutional Neural Networks](https://ieeexplore.ieee.org/iel7/36/7880748/07827088.pdf)
[High-Resolution Remote Sensing Image Retrieval Based on CNNs from a Dimensional Perspective](https://www.mdpi.com/209338)
```
@article{long2017accurate,
title = {Accurate object localization in remote sensing images based on convolutional neural networks},
author = {Long, Yang and Gong, Yiping and Xiao, Zhifeng and Liu, Qing},
year = 2017,
journal = {IEEE Transactions on Geoscience and Remote Sensing},
publisher = {IEEE},
volume = 55,
number = 5,
pages = {2486--2498}
}
@article{xiao2017high,
title = {High-resolution remote sensing image retrieval based on CNNs from a dimensional perspective},
author = {Xiao, Zhifeng and Long, Yang and Li, Deren and Wei, Chunshan and Tang, Gefu and Liu, Junyi},
year = 2017,
journal = {Remote Sensing},
publisher = {MDPI},
volume = 9,
number = 7,
pages = 725
}
``` |
liyucheng/novel_metaphor | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: lemmas
sequence: string
- name: poses
sequence: string
- name: metaphor_classes
sequence: int64
- name: novel_score
sequence: float64
splits:
- name: train
num_bytes: 17600252
num_examples: 32036
download_size: 3437305
dataset_size: 17600252
---
# Dataset Card for "novel_metaphor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nishimaki/taiyo | ---
license: openrail
---
|
Codec-SUPERB/Nsynth-test_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
splits:
- name: original
num_bytes: 524610083.0
num_examples: 4096
- name: academicodec_hifi_16k_320d
num_bytes: 524619150.64
num_examples: 4096
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 524619150.64
num_examples: 4096
- name: academicodec_hifi_24k_320d
num_bytes: 786763150.64
num_examples: 4096
- name: audiodec_24k_320d
num_bytes: 786763662.64
num_examples: 4096
- name: dac_16k
num_bytes: 524619150.64
num_examples: 4096
- name: dac_24k
num_bytes: 786763150.64
num_examples: 4096
- name: dac_44k
num_bytes: 1445399950.64
num_examples: 4096
- name: encodec_24k_12bps
num_bytes: 786763150.64
num_examples: 4096
- name: encodec_24k_1_5bps
num_bytes: 786763150.64
num_examples: 4096
- name: encodec_24k_24bps
num_bytes: 786763150.64
num_examples: 4096
- name: encodec_24k_3bps
num_bytes: 786763150.64
num_examples: 4096
- name: encodec_24k_6bps
num_bytes: 786763150.64
num_examples: 4096
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 524619150.64
num_examples: 4096
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 524619150.64
num_examples: 4096
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 524619150.64
num_examples: 4096
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 524619150.64
num_examples: 4096
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 524619150.64
num_examples: 4096
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 524619150.64
num_examples: 4096
- name: speech_tokenizer_16k
num_bytes: 524619150.64
num_examples: 4096
download_size: 10584062989
dataset_size: 13510307257.159996
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
ovior/twitter_dataset_1713109473 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2671259
num_examples: 8177
download_size: 1504105
dataset_size: 2671259
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anujsahani01/MarEng_TransLoom | ---
license: mit
task_categories:
- translation
language:
- en
- mr
--- |
yleo/emerton_dpo_pairs_judge | ---
dataset_info:
features:
- name: system
dtype: string
- name: input
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: generations
sequence: string
- name: generation_model
sequence: string
- name: rating
sequence: float32
- name: chosen_judge
dtype: string
- name: rejected_judge
dtype: string
- name: chosen_judge_model
dtype: string
- name: rejected_judge_model
dtype: string
- name: rejected_judge_score
dtype: float64
- name: chosen_judge_score
dtype: float64
splits:
- name: train
num_bytes: 38173225
num_examples: 5489
download_size: 21529431
dataset_size: 38173225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This dataset is the result of running a judge over the answers from GPT-4 and GPT-4 Turbo.
It is the judge version of [yleo/emerton_dpo_pairs](https://huggingface.co/datasets/yleo/emerton_dpo_pairs)
To perform the judge, [llm-blender/PairRM](https://huggingface.co/llm-blender/PairRM) is used.
I recommend filtering on chosen_judge_score > 1 to keep only significant gaps.
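A minimal sketch of that filter on toy rows (the scores below are invented for illustration; in practice you would call `.filter` on `load_dataset("yleo/emerton_dpo_pairs_judge", split="train")`):

```python
# Toy stand-ins for dataset rows; the scores are made up for illustration.
rows = [
    {"chosen_judge_score": 0.4, "rejected_judge_score": 0.2},
    {"chosen_judge_score": 1.3, "rejected_judge_score": 0.5},
    {"chosen_judge_score": 2.1, "rejected_judge_score": 0.9},
]

# Keep only rows with a significant judged gap, per the recommendation above.
kept = [r for r in rows if r["chosen_judge_score"] > 1]
```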
|
AdapterOcean/med_alpaca_standardized_cluster_19_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 17710103
num_examples: 11565
download_size: 9022105
dataset_size: 17710103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_19_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2 | ---
pretty_name: Evaluation run of nlpguy/Hermes-low-tune-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/Hermes-low-tune-2](https://huggingface.co/nlpguy/Hermes-low-tune-2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T13:59:33.272174](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2/blob/main/results_2024-01-05T13-59-33.272174.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389638892457566,\n\
\ \"acc_stderr\": 0.03228226820237424,\n \"acc_norm\": 0.6407807294820688,\n\
\ \"acc_norm_stderr\": 0.03292777968100128,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5318336325194422,\n\
\ \"mc2_stderr\": 0.01508871153008636\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670733,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6512646883091018,\n\
\ \"acc_stderr\": 0.004755960559929163,\n \"acc_norm\": 0.8446524596693886,\n\
\ \"acc_norm_stderr\": 0.0036149536450656443\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800897,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800897\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n\
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005564,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005564\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"\
acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045702,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045702\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5318336325194422,\n\
\ \"mc2_stderr\": 0.01508871153008636\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \
\ \"acc_stderr\": 0.013258428375662247\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/Hermes-low-tune-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|arc:challenge|25_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|arc:challenge|25_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|gsm8k|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|gsm8k|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hellaswag|10_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hellaswag|10_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T13-58-35.823625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T13-59-33.272174.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- '**/details_harness|winogrande|5_2024-01-05T13-58-35.823625.parquet'
- split: 2024_01_05T13_59_33.272174
path:
- '**/details_harness|winogrande|5_2024-01-05T13-59-33.272174.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T13-59-33.272174.parquet'
- config_name: results
data_files:
- split: 2024_01_05T13_58_35.823625
path:
- results_2024-01-05T13-58-35.823625.parquet
- split: 2024_01_05T13_59_33.272174
path:
- results_2024-01-05T13-59-33.272174.parquet
- split: latest
path:
- results_2024-01-05T13-59-33.272174.parquet
---
# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Hermes-low-tune-2](https://huggingface.co/nlpguy/Hermes-low-tune-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2",
"harness_winogrande_5",
    split="latest")
```
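Each configuration also keeps one split per evaluation run, named after the run's timestamp (with `-` and `:` replaced by `_`, e.g. `2024_01_05T13_59_33.272174`). If you need to pick the most recent run programmatically instead of relying on the `latest` alias, a minimal sketch (the `latest_split` helper below is illustrative, not part of any library) is:

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent run split from timestamped split names.

    Split names follow the pattern 2024_01_05T13_59_33.272174,
    i.e. an ISO timestamp with '-' and ':' replaced by '_'.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

# The two runs present in this repository:
splits = ["2024_01_05T13_58_35.823625", "2024_01_05T13_59_33.272174"]
print(latest_split(splits))  # -> 2024_01_05T13_59_33.272174
```

This is equivalent to loading with `split="latest"`, which this card's configurations already alias to the newest run.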
## Latest results
These are the [latest results from run 2024-01-05T13:59:33.272174](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2/blob/main/results_2024-01-05T13-59-33.272174.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6389638892457566,
"acc_stderr": 0.03228226820237424,
"acc_norm": 0.6407807294820688,
"acc_norm_stderr": 0.03292777968100128,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5318336325194422,
"mc2_stderr": 0.01508871153008636
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670733,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6512646883091018,
"acc_stderr": 0.004755960559929163,
"acc_norm": 0.8446524596693886,
"acc_norm_stderr": 0.0036149536450656443
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800897,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800897
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005564,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005564
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045702,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045702
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5318336325194422,
"mc2_stderr": 0.01508871153008636
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712666
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662247
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LRudL/x_thinks_y | ---
dataset_info:
features:
- name: type
dtype: string
- name: false_part
dtype: string
- name: true_version_of_part
dtype: string
- name: entire_statement
dtype: string
splits:
- name: train
num_bytes: 25046
num_examples: 96
download_size: 18252
dataset_size: 25046
---
# Dataset Card for "x_thinks_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Samsoup/WNLI | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_entailment
'1': entailment
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 107109
num_examples: 635
- name: validation
num_bytes: 12162
num_examples: 71
- name: test
num_bytes: 37889
num_examples: 146
download_size: 63474
dataset_size: 157160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "WNLI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ihaflix1/Drauziovirella | ---
license: openrail
---
|
pachi107/autotrain-data-ethos-sentiments | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: ethos-sentiments
## Dataset Description
This dataset has been automatically processed by AutoTrain for project ethos-sentiments.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "well it's clear now why europeans can't differ niggers and shitskins from human",
"target": 0
},
{
"text": "These boys will then grow up with people making fun of them and they will then hate their parents for ruining their lives.",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=2, names=['hate_speech', 'no_hate_speech'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 798 |
| valid | 200 |
|
AlekseyKorshuk/camel-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 226701847.0
num_examples: 110000
download_size: 106777582
dataset_size: 226701847.0
---
# Dataset Card for "camel-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lehrig/GTZAN-Collection | ---
license: apache-2.0
---
# Dataset Card for GTZAN Collection
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://github.com/derekahuang/Music-Classification
- **Repository:** https://github.com/derekahuang/Music-Classification
- **Paper:** [Musical genre classification of audio signals](https://ieeexplore.ieee.org/document/1021072)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
The dataset consists of 1000 audio tracks each 30 seconds long.
It contains 10 genres, each represented by 100 tracks.
The tracks are all 22050Hz Mono 16-bit audio files in .wav format.
The genres are:
* blues
* classical
* country
* disco
* hiphop
* jazz
* metal
* pop
* reggae
* rock
This collection includes the following GTZAN variants:
* raw (original WAV files)
* melspectrograms (from each WAV file, contiguous 2-second windows at 4 random locations are sampled and transformed to Mel Spectrograms, resulting in 8000 Mel Spectrograms)
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
[Needs More Information]
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] |
ekshat/text-2-sql-with-context | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 32317282.06065388
num_examples: 74648
- name: test
num_bytes: 1700977.939346119
num_examples: 3929
download_size: 8982199
dataset_size: 34018260.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "text-2-sql-with-context"
This dataset is prepared in the Alpaca format introduced by Stanford for training LLMs. It has been used to fine-tune Chat Llama-2 7B. For more information, please visit: https://huggingface.co/ekshat/Llama-2-7b-chat-finetune-for-text2sql |
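As a rough illustration of the Alpaca template mentioned above (the exact wording packed into this dataset's single `text` field may differ), the sketch below assembles one text-to-SQL training example; the schema and query shown are made up for the example:

```python
def build_alpaca_prompt(instruction: str, context: str, response: str) -> str:
    """Assemble one training example in the classic Alpaca template."""
    return (
        "Below is an instruction that describes a task, paired with an input "
        "that provides further context. Write a response that appropriately "
        "completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Input:\n{context}\n\n"
        f"### Response:\n{response}"
    )

# Hypothetical text-to-SQL example in the style of this dataset
example = build_alpaca_prompt(
    instruction="Translate the question into an SQL query: who earns over 50000?",
    context="CREATE TABLE employees (name TEXT, salary INTEGER)",
    response="SELECT name FROM employees WHERE salary > 50000",
)
print(example)
```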
medkit/simsamu | ---
language: fr
license: mit
multilinguality: monolingual
task_categories:
- automatic-speech-recognition
- voice-activity-detection
---
# Simsamu dataset
This repository contains recordings of simulated medical dispatch dialogs in the
french language, annotated for diarization and transcription. It is published
under the MIT license.
These dialogs were recorded as part of the training of emergency medicine
interns, which consisted of simulating a medical dispatch call in which the interns
took turns playing the caller and the regulating doctor.
Each situation was decided randomly in advance, blind to who was playing the
medical dispatcher (e.g., road accident, chest pain, burns, etc.). The
affiliations between the caller and the patient (family, friend, colleague...)
and the caller's communication mode were then randomly selected. The caller had to
adapt his or her performance to the communication mode associated with the
situation. Seven communication modes were defined: shy, procedural, angry,
cooperative, frightened, impassive, incomprehensible.
Regarding sound quality, the voice of the regulating doctor is directly picked
up by a microphone, whereas the voice of the caller is transmitted through the
phone network and re-emitted by a phone speaker before being picked up by the
microphone. This leads to different acoustic characteristics between the
caller's voice and the regulator's, the latter often being much clearer. This
phenomenon is also present in recordings from actual dispatch services, where the
regulator's voice is recorded directly in a quiet room, whereas the caller is
often calling from a noisier environment and their voice is degraded by the phone
network's compression.
The dataset is composed of 61 audio recordings with a total duration of 3 hours
15 minutes and an average duration of 3 minutes 11 seconds per recording. Each
recording is available as a `.m4a` audio file with an 8 kHz sample rate and a
128 kbps bitrate.
The diarization data is available in a corresponding `.rttm` file and the
transcription in an `.srt` file.
An additional `metadata.csv` contains speaker ids for callers and regulators in
each recording.
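The `.rttm` diarization files follow the standard line-based RTTM convention (one `SPEAKER` record per speech segment, with onset and duration in seconds). A minimal parser sketch, assuming standard RTTM field order — the file and speaker names below are hypothetical, not taken from the dataset:

```python
from dataclasses import dataclass

@dataclass
class RttmSegment:
    file_id: str
    onset: float      # segment start, in seconds
    duration: float   # segment length, in seconds
    speaker: str

def parse_rttm_line(line: str) -> RttmSegment:
    # Standard RTTM fields: type file chan onset duration ortho stype speaker conf slat
    fields = line.split()
    assert fields[0] == "SPEAKER", "only SPEAKER records are handled here"
    return RttmSegment(
        file_id=fields[1],
        onset=float(fields[3]),
        duration=float(fields[4]),
        speaker=fields[7],
    )

# Hypothetical record, for illustration only
seg = parse_rttm_line("SPEAKER rec01 1 12.34 5.60 <NA> <NA> caller <NA> <NA>")
```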
See also: [Simsamu diarization
pipeline](https://huggingface.co/medkit/simsamu-diarization)
See also: [Simsamu transcription
model](https://huggingface.co/medkit/simsamu-transcription)
|
LambdaTests/VQAv2_sample_validation_benchmarks_partition_3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 58
num_examples: 2
download_size: 0
dataset_size: 58
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cokecigar/testgfox001 | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_rte_drop_copula_be_AP | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 68851
num_examples: 161
- name: train
num_bytes: 72469
num_examples: 161
download_size: 101656
dataset_size: 141320
---
# Dataset Card for "MULTI_VALUE_rte_drop_copula_be_AP"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JM-Lee/IFT_20240226 | ---
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- text-generation
pretty_name: IFT
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 4600473
num_examples: 3829
- name: test
num_bytes: 510823
num_examples: 426
download_size: 15984713
dataset_size: 5111296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jordiclive/FABSA | ---
dataset_info:
features:
- name: id
dtype: int64
- name: org_index
dtype: int64
- name: data_source
dtype: string
- name: industry
dtype: string
- name: text
dtype: string
- name: labels
sequence:
sequence: string
- name: label_codes
dtype: string
splits:
- name: train
num_bytes: 2599501.8469831664
num_examples: 7930
- name: validation
num_bytes: 346490.977586533
num_examples: 1057
- name: test
num_bytes: 520228.17543030076
num_examples: 1587
download_size: 1010316
dataset_size: 3466221.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# FABSA, An aspect-based sentiment analysis dataset in the Customer Feedback space (Trustpilot, Google Play and Apple Store reviews).
A professionally annotated dataset released by [Chattermill AI](https://chattermill.com/), with 8 years of experience in leveraging advanced ML analytics in the customer feedback space for high-profile clients such as Amazon and Uber.
Two annotators possess extensive experience in developing human-labeled ABSA datasets for commercial companies, while the third annotator holds a PhD in computational linguistics.
## Task
This dataset encompasses **Aspect Category Sentiment Analysis** and is suitable for both **Aspect Category Detection** (ACD) and **Aspect Category Sentiment Classification** (ACSC). ACD in sentiment analysis identifies aspect categories mentioned in a sentence. These categories are conceptual; they may not explicitly appear in the review and are chosen from a predefined list of Aspect Categories. ACSC classifies the sentiment polarities of these conceptual aspect categories.
The predefined list of Aspect Categories for this dataset are:
```
category category_code
0 Account management: Account access account-management.account-access
1 Company brand: Competitor company-brand.competitor
2 Company brand: General satisfaction company-brand.general-satisfaction
3 Company brand: Reviews company-brand.reviews
4 Logistics rides: Speed logistics-rides.speed
5 Online experience: App website online-experience.app-website
6 Purchase booking experience: Ease of use purchase-booking-experience.ease-of-use
7 Staff support: Attitude of staff staff-support.attitude-of-staff
8 Staff support: Email staff-support.email
9 Staff support: Phone staff-support.phone
10 Value: Discounts promotions value.discounts-promotions
11 Value: Price value for money value.price-value-for-money
```
## Annotation Scheme
The FABSA dataset is manually labeled according to a hierarchical annotation scheme that includes parent and child aspect categories. Each aspect category comes with an associated sentiment label (positive, negative, and neutral), creating a total of (12 × 3) target classification categories.
In line with prior studies, we employ a multi-label classification scheme where each review is tagged with one or more aspect + sentiment labels. Thus, a single review can cover multiple aspects and express various (sometimes contrasting) polarities. For example,
```
Customer Review:
“product is very good but customer service is really bad, they never respond”
Labels: [(Company brand: General satisfaction, Positive), (Staff support: Attitude of staff, Negative)]
```

## Release
There has been a lack of high-quality ABSA datasets covering broad domains and addressing real-world applications. Academic progress has been confined to benchmarking on domain-specific, toy datasets such as restaurants and laptops, which are limited in size (e.g., [SemEval Task ABSA](https://aclanthology.org/S16-1002.pdf) or [SentiHood](https://aclanthology.org/C16-1146/)).
This dataset is part of the [FABSA paper](https://www.sciencedirect.com/science/article/pii/S0925231223009906), and we release it hoping to advance academic progress as tools for ingesting and analyzing customer feedback at scale improve significantly, yet evaluation datasets continue to lag. FABSA is a new, large-scale, multi-domain ABSA dataset of feedback reviews, consisting of approximately 10,500 reviews spanning 10 domains (Fashion, Consulting, Travel Booking, Ride-hailing, Banking, Trading, Streaming, Price Comparison, Information Technology, and Groceries).
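Given the multi-label scheme described above, each row's `labels` field pairs an aspect category with a polarity. A small sketch of turning such pairs into an aspect-to-sentiment mapping — the exact encoding of the field is an assumption here, and the example row is invented:

```python
def labels_to_dict(labels):
    """Convert FABSA-style label pairs into {aspect_code: sentiment}.

    Assumes each entry in `labels` is an [aspect_code, sentiment] pair,
    matching the (aspect category, polarity) scheme described in the card.
    """
    return {aspect: sentiment for aspect, sentiment in labels}

# Invented example row mirroring the review quoted above
row_labels = [
    ["company-brand.general-satisfaction", "positive"],
    ["staff-support.attitude-of-staff", "negative"],
]
parsed = labels_to_dict(row_labels)
```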
## Citation
```
@article{KONTONATSIOS2023126867,
title = {FABSA: An aspect-based sentiment analysis dataset of user reviews},
journal = {Neurocomputing},
volume = {562},
pages = {126867},
year = {2023},
issn = {0925-2312},
doi = {https://doi.org/10.1016/j.neucom.2023.126867},
url = {https://www.sciencedirect.com/science/article/pii/S0925231223009906},
author = {Georgios Kontonatsios and Jordan Clive and Georgia Harrison and Thomas Metcalfe and Patrycja Sliwiak and Hassan Tahir and Aji Ghose},
keywords = {ABSA, Multi-domain dataset, Deep learning},
}
``` |
DynamicSuperbPrivate/SpoofDetection_Asvspoof2017 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 270119397.248
num_examples: 3014
- name: validation
num_bytes: 169603853.98
num_examples: 1710
download_size: 415434782
dataset_size: 439723251.22800004
---
# Dataset Card for "SpoofDetection_ASVspoof2017"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ezipe/lichess_2023_janoct_shards | ---
license: apache-2.0
---
## Installation
```
pip install datasets numpy zstandard
```
## Usage
I've given up on trying to get this to work natively with Hugging Face `datasets`. Native support would be nice because it allows streaming (https://huggingface.co/docs/datasets/en/about_mapstyle_vs_iterable) and provides functions like `map` that easily parallelize operations over the dataset. I may try to get this working in the future, but for now it gets stuck: the downloaded and extracted zstd files are not decompressible for some reason, and `datasets` has to rewrite the entire dataset into Arrow first.
<!-- ```
from datasets import load_dataset
dataset = load_dataset("ezipe/lichess_2023_janoct_shards")
``` -->
For now, clone the repository directly (slower):
```
conda install git-lfs
git lfs install
git clone https://huggingface.co/datasets/ezipe/lichess_2023_janoct_shards/
```
|
liuyanchen1015/MULTI_VALUE_cola_his_he | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1437
num_examples: 18
- name: test
num_bytes: 2195
num_examples: 30
- name: train
num_bytes: 22627
num_examples: 301
download_size: 19026
dataset_size: 26259
---
# Dataset Card for "MULTI_VALUE_cola_his_he"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Heralax/Augmental-Dataset | ---
license: unknown
---
# A High-Quality AI Augmented Dataset for RP and conversation
This dataset comprises lines from the visual novel Steins;Gate, which have been filtered, reformatted, AI-rewritten (many of them twice), and in a few cases, manually quality checked.
The flagship model of this dataset (a finetune on top of MythoMax) can be found [here](https://huggingface.co/Heralax/Augmental-13b)!
It contains a large number of RP-focused, multiturn conversational training examples, from the perspectives of multiple characters.
The "Scenario" column (AI-generated) describes the context behind the conversation in which a line takes place.
The "Completion" column (human-written, AI-enhanced) is the content of the line in that row.
The "Speaker" column contains the name of the speaker of that line, and is one of ["Okabe", "Kurisu", "Faris", "Luka", "Itaru", "Suzuha", "Mayuri"].
The "History" column contains the lines that come before the completion in a given conversation. There will always be at least one.
See the repo with the dataset generation code here https://github.com/e-p-armstrong/amadeus (train.py and make_card_evanchat.py) for an example of how these might be formatted for training.
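The columns described above can be stitched into a single training string. The sketch below is an illustrative template only, not the formatting actually used by the Augmental models (see the linked repo for that); the example row is invented:

```python
def format_example(scenario, history, speaker, completion):
    """Render one dataset row as a single training string.

    Column names follow the card (Scenario, History, Speaker, Completion);
    the surrounding template is an illustrative choice.
    """
    lines = [f"Scenario: {scenario}", ""]
    lines.extend(history)                       # prior lines of the conversation
    lines.append(f"{speaker}: {completion}")    # the line being trained on
    return "\n".join(lines)

# Invented example row
text = format_example(
    scenario="Okabe and Kurisu argue in the lab.",
    history=["Kurisu: You actually believe that theory?"],
    speaker="Okabe",
    completion="Of course. I am the mad scientist Hououin Kyouma!",
)
```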
# IMPORTANT NOTE
I'll be honest: I don't know where this dataset sits legally. On one hand, the source text was substantially revised over the course of costly weeks of experimentation; the source is not included in its entirety; and the source has been used by other AI creators before with the knowledge of the publisher of the source material. Finally, there's good precedent for using copyrighted work in AI training (CommonCrawl contains plenty of copyrighted work, for instance, and big companies use copyrighted data all the time).
On the other hand, this dataset is entirely composed of text that was generated using copyrighted text as an input.
This all being considered, I've left the license field of the model BLANK and will take the dataset down without any questions asked if the legal holders of the copyright ask me to do so. I hope they don't ask me to do so, because this is a project I'm really proud of, and I think it could be useful to the open source AI RP community. If someone from MAGES is reading this, then I ask them, as someone who's bought nearly every SciAdv game from C;H to A;C, please don't kill it. Thank you.
|
orgcatorg/multilingual | ---
dataset_info:
- config_name: eng_Latn-ben_Beng
features:
- name: translation
struct:
- name: ben_Beng
dtype: string
- name: eng_Latn
dtype: string
splits:
- name: train
num_bytes: 861347100
num_examples: 3807057
download_size: 457359684
dataset_size: 861347100
- config_name: eng_Latn-hin_Deva
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: hin_Deva
dtype: string
splits:
- name: train
num_bytes: 1835061414
num_examples: 5525375
download_size: 966770811
dataset_size: 1835061414
- config_name: eng_Latn-lao_Laoo
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: lao_Laoo
dtype: string
splits:
- name: train
num_bytes: 42871606
num_examples: 140265
download_size: 23468883
dataset_size: 42871606
- config_name: eng_Latn-mya_Mymr
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: mya_Mymr
dtype: string
splits:
- name: train
num_bytes: 70235556
num_examples: 248767
download_size: 34667809
dataset_size: 70235556
- config_name: eng_Latn-tgl_Latn
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: tgl_Latn
dtype: string
splits:
- name: train
num_bytes: 860044626
num_examples: 4335174
download_size: 602646732
dataset_size: 860044626
- config_name: eng_Latn-tha_Thai
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: tha_Thai
dtype: string
splits:
- name: train
num_bytes: 433620969
num_examples: 1388326
download_size: 231017202
dataset_size: 433620969
- config_name: eng_Latn-vie_Latn
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: vie_Latn
dtype: string
splits:
- name: train
num_bytes: 2088114876
num_examples: 8742176
download_size: 1386936411
dataset_size: 2088114876
- config_name: eng_Latn-zsm_Latn
features:
- name: translation
struct:
- name: eng_Latn
dtype: string
- name: zsm_Latn
dtype: string
splits:
- name: train
num_bytes: 1665180036
num_examples: 6279419
download_size: 1123124266
dataset_size: 1665180036
configs:
- config_name: eng_Latn-ben_Beng
data_files:
- split: train
path: eng_Latn-ben_Beng/train-*
- config_name: eng_Latn-hin_Deva
data_files:
- split: train
path: eng_Latn-hin_Deva/train-*
- config_name: eng_Latn-lao_Laoo
data_files:
- split: train
path: eng_Latn-lao_Laoo/train-*
- config_name: eng_Latn-mya_Mymr
data_files:
- split: train
path: eng_Latn-mya_Mymr/train-*
- config_name: eng_Latn-tgl_Latn
data_files:
- split: train
path: eng_Latn-tgl_Latn/train-*
- config_name: eng_Latn-tha_Thai
data_files:
- split: train
path: eng_Latn-tha_Thai/train-*
- config_name: eng_Latn-vie_Latn
data_files:
- split: train
path: eng_Latn-vie_Latn/train-*
- config_name: eng_Latn-zsm_Latn
data_files:
- split: train
path: eng_Latn-zsm_Latn/train-*
---
# Dataset Card for "multilingual"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_50_1713167291 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 294389
num_examples: 747
download_size: 153841
dataset_size: 294389
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dragon4926/orca-cpy | ---
dataset_info:
features:
- name: samples
struct:
- name: inputs
dtype: string
splits:
- name: train
num_bytes: 3084
num_examples: 6
download_size: 5360
dataset_size: 3084
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
globis-university/aozorabunko-chats | ---
license: cc-by-4.0
task_categories:
- text-generation
- text-classification
language:
- ja
size_categories:
- 100K<n<1M
---
# Overview
This dataset consists of conversations extracted from [Aozora Bunko (青空文庫)](https://www.aozora.gr.jp/), which collects public-domain books in Japan, using a simple heuristic approach.
[For Japanese] 日本語での概要説明を Qiita に記載しました: https://qiita.com/akeyhero/items/b53eae1c0bc4d54e321f
# Method
First, lines surrounded by quotation mark pairs (`「」`) are extracted as utterances from the `text` field of [globis-university/aozorabunko-clean](https://huggingface.co/datasets/globis-university/aozorabunko-clean).
Then, consecutive utterances are collected and grouped together.
The code to reproduce this dataset is made available on GitHub: [globis-org/aozorabunko-extractor](https://github.com/globis-org/aozorabunko-extractor).
# Notice
As the conversations are extracted using a simple heuristic, a certain amount of the data may actually be monologues.
# Tips
If you prefer to employ only modern Japanese, you can filter entries with: `row["meta"]["文字遣い種別"] == "新字新仮名"`.
# Example
```py
>>> from datasets import load_dataset
>>> ds = load_dataset('globis-university/aozorabunko-chats')
>>> ds
DatasetDict({
train: Dataset({
features: ['chats', 'footnote', 'meta'],
num_rows: 5531
})
})
>>> ds = ds.filter(lambda row: row['meta']['文字遣い種別'] == '新字新仮名') # only modern Japanese
>>> ds
DatasetDict({
train: Dataset({
features: ['chats', 'footnote', 'meta'],
num_rows: 4139
})
})
>>> book = ds['train'][0] # one of the works
>>> book['meta']['作品名']
'スリーピー・ホローの伝説'
>>> chats = book['chats'] # list of the chats in the work; type: list[list[str]]
>>> len(chats)
1
>>> chat = chats[0] # one of the chats; type: list[str]
>>> for utterance in chat:
... print(utterance)
...
人生においては、たとえどんな場合でも必ず利点や愉快なことがあるはずです。もっともそれは、わたくしどもが冗談をすなおに受けとればのことですが
そこで、悪魔の騎士と競走することになった人は、とかくめちゃくちゃに走るのも当然です
したがって、田舎の学校の先生がオランダ人の世継ぎ娘に結婚を拒まれるということは、彼にとっては、世の中で栄進出世にいたるたしかな一歩だということになります
```
# License
CC BY 4.0 |
gracebwu/amazon-toys | ---
license: unknown
---
|
one-sec-cv12/chunk_201 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 18388197504.0
num_examples: 191448
download_size: 16635752940
dataset_size: 18388197504.0
---
# Dataset Card for "chunk_201"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deprem-private/intent_test_v13_anonymized | ---
dataset_info:
features:
- name: image_url
dtype: string
- name: label
sequence: string
- name: label_confidence
sequence: float64
- name: labeler
dtype: string
- name: label_creation_time
dtype: int64
splits:
- name: train
num_bytes: 588460
num_examples: 2028
download_size: 313656
dataset_size: 588460
---
# Dataset Card for "intent_test_v13_anonymized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SQexplorer/SQ | ---
license: openrail
---
|
PapaObi/TCWStyle | ---
license: openrail
---
|
parseny/DCS_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 163160222
num_examples: 1145003
download_size: 30946202
dataset_size: 163160222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "DCS_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
saibo/bookcorpus_compact_1024_shard1_of_10 | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
splits:
- name: train
num_bytes: 733627676
num_examples: 61605
download_size: 367870833
dataset_size: 733627676
---
# Dataset Card for "bookcorpus_compact_1024_shard1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TeamSODA/cl-signal_processing_attacks_whisper_librispeech | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': 0-original
'1': 1-attacked
splits:
- name: train
num_bytes: 13751864078
num_examples: 18000
download_size: 910820595
dataset_size: 13751864078
license: openrail
task_categories:
- audio-classification
language:
- en
pretty_name: SodaSP
size_categories:
- 1K<n<10K
---
# Dataset Card for "cl-signal_processing_attacks_large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xreborn/ds3 | ---
license: apache-2.0
---
|
distilled-from-one-sec-cv12/chunk_67 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1218300876
num_examples: 237393
download_size: 1243586424
dataset_size: 1218300876
---
# Dataset Card for "chunk_67"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fcuadra/MarketMail_AI | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 53877
num_examples: 50
download_size: 37010
dataset_size: 53877
---
# Dataset Card for "MarketMail_AI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coeuslearning/product_ads_c | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 5006
num_examples: 25
download_size: 6203
dataset_size: 5006
license: openrail
task_categories:
- text-generation
language:
- en
tags:
- art
pretty_name: Product Ads Current
size_categories:
- 1K<n<10K
---
# Dataset Card for "product_ads_c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgallouedec/prj_gia_dataset_metaworld_button_press_topdown_wall_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the button-press-topdown-wall-v2 environment, sampled from the button-press-topdown-wall-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_button_press_topdown_wall_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_button_press_topdown_wall_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
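Once loaded, the flat arrays can be sliced into episodes using the `dones` flags. The snippet below is a sketch under the assumption that all four arrays are aligned per timestep; it uses dummy data in place of the real `dataset.npy` so it runs standalone:

```python
import numpy as np

# Dummy stand-in for the dict loaded from dataset.npy (same keys as above).
dataset = {
    "observations": np.arange(10, dtype=np.float32).reshape(5, 2),
    "actions": np.zeros((5, 1), dtype=np.float32),
    "rewards": np.array([0.0, 1.0, 0.0, 0.0, 1.0]),
    "dones": np.array([False, True, False, False, True]),
}

# Episode boundaries: each True in `dones` closes an episode.
ends = np.flatnonzero(dataset["dones"]) + 1
starts = np.concatenate(([0], ends[:-1]))
episodes = [
    {k: v[s:e] for k, v in dataset.items()} for s, e in zip(starts, ends)
]
print(len(episodes))                  # 2
print(episodes[0]["rewards"].sum())   # 1.0
```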
|
dim/essayforum_raw_writing_10k | ---
license: mit
dataset_info:
features:
- name: message
dtype: string
- name: author
dtype: string
- name: date
dtype: string
- name: position
dtype: int64
- name: url
dtype: string
- name: forum_type
dtype: string
splits:
- name: train
num_bytes: 39625264
num_examples: 29604
download_size: 19976186
dataset_size: 39625264
---
|
Falah/village4kids_0_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2702
num_examples: 10
download_size: 4036
dataset_size: 2702
---
# Dataset Card for "village4kids_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dmitriy007/Lenta_2_v2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 699990424
num_examples: 51334
- name: validation
num_bytes: 69734504
num_examples: 5114
- name: test
num_bytes: 74438924
num_examples: 5459
download_size: 265249928
dataset_size: 844163852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful | ---
pretty_name: Evaluation run of eren23/OGNO-7b-dpo-truthful
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/OGNO-7b-dpo-truthful](https://huggingface.co/eren23/OGNO-7b-dpo-truthful)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T22:05:00.872209](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful/blob/main/results_2024-02-16T22-05-00.872209.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652348491009575,\n\
\ \"acc_stderr\": 0.03197996047742033,\n \"acc_norm\": 0.6516420684516023,\n\
\ \"acc_norm_stderr\": 0.03264932973384798,\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7660822976380632,\n\
\ \"mc2_stderr\": 0.013995111777693896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7150965943039235,\n\
\ \"acc_stderr\": 0.004504459553909766,\n \"acc_norm\": 0.890161322445728,\n\
\ \"acc_norm_stderr\": 0.0031204952388275576\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7660822976380632,\n\
\ \"mc2_stderr\": 0.013995111777693896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272955\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \
\ \"acc_stderr\": 0.012740305717376268\n }\n}\n```"
repo_url: https://huggingface.co/eren23/OGNO-7b-dpo-truthful
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|arc:challenge|25_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|gsm8k|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hellaswag|10_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T22-05-00.872209.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- '**/details_harness|winogrande|5_2024-02-16T22-05-00.872209.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T22-05-00.872209.parquet'
- config_name: results
data_files:
- split: 2024_02_16T22_05_00.872209
path:
- results_2024-02-16T22-05-00.872209.parquet
- split: latest
path:
- results_2024-02-16T22-05-00.872209.parquet
---
# Dataset Card for Evaluation run of eren23/OGNO-7b-dpo-truthful
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/OGNO-7b-dpo-truthful](https://huggingface.co/eren23/OGNO-7b-dpo-truthful) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful",
"harness_winogrande_5",
split="train")
```
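Each MMLU subject has its own configuration following the naming pattern visible in the YAML above. As a convenience, a small helper (illustrative, not part of the dataset itself) can build the config name for any subject:

```python
def details_config_name(subject: str, n_shot: int = 5) -> str:
    """Build the details config name for an MMLU subject,
    following the harness_hendrycksTest_<subject>_<n_shot> pattern."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# details_config_name("anatomy") -> "harness_hendrycksTest_anatomy_5"
```

The returned string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.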
## Latest results

These are the [latest results from run 2024-02-16T22:05:00.872209](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__OGNO-7b-dpo-truthful/blob/main/results_2024-02-16T22-05-00.872209.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; each task's results can be found in the "results" configuration and in the "latest" split of that task's configuration):
```python
{
"all": {
"acc": 0.652348491009575,
"acc_stderr": 0.03197996047742033,
"acc_norm": 0.6516420684516023,
"acc_norm_stderr": 0.03264932973384798,
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7660822976380632,
"mc2_stderr": 0.013995111777693896
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7150965943039235,
"acc_stderr": 0.004504459553909766,
"acc_norm": 0.890161322445728,
"acc_norm_stderr": 0.0031204952388275576
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7660822976380632,
"mc2_stderr": 0.013995111777693896
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272955
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.012740305717376268
}
}
```
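Once loaded, the results are plain nested dictionaries, so per-task metrics can be collected with ordinary dict comprehensions. The sketch below uses an abridged dict shaped like the JSON above (values copied from it for illustration):

```python
# A results dict shaped like the JSON above (abridged for illustration).
results = {
    "all": {"acc": 0.652348491009575, "acc_norm": 0.6516420684516023},
    "harness|winogrande|5": {"acc": 0.8468823993685872},
    "harness|gsm8k|5": {"acc": 0.6899166034874905},
}

# Collect per-task accuracies, skipping the "all" aggregate entry
# (and any task, like truthfulqa:mc, that reports no "acc" key).
task_accs = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2`, filtering on whichever metric key the task reports.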
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_244 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 932686364
num_examples: 183167
download_size: 951703928
dataset_size: 932686364
---
# Dataset Card for "chunk_244"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ossaili/test_01 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 102096.0
num_examples: 1
download_size: 103703
dataset_size: 102096.0
---
# Dataset Card for "test_01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_1.3b_Attributes_Caption_ns_5647_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 85893289.125
num_examples: 5647
- name: fewshot_3_bs_16
num_bytes: 88897029.125
num_examples: 5647
download_size: 168050181
dataset_size: 174790318.25
---
# Dataset Card for "Caltech101_not_background_test_facebook_opt_1.3b_Attributes_Caption_ns_5647_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spacemanidol/summary-enhanced-msmarco-passage | ---
license: apache-2.0
---
|
malteee/SynTuckPlatform | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
- name: bbox
sequence: float64
splits:
- name: train
num_bytes: 167539793.0
num_examples: 169
download_size: 98942920
dataset_size: 167539793.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SynTuckPlatform"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-miscellaneous-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 4914
num_examples: 5
- name: test
num_bytes: 3633674
num_examples: 783
download_size: 451301
dataset_size: 3638588
---
# Dataset Card for "mmlu-miscellaneous-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yvillamil/long-data-collection-finetune-50k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3269978812
num_examples: 50000
download_size: 1811572919
dataset_size: 3269978812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- en
---
# Dataset Card for "long-data-collection-finetune-50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
This dataset is a 50k-row sample of the fine-tuning subset created by togethercomputer, which can be found at https://huggingface.co/datasets/togethercomputer/Long-Data-Collections under the fine-tune path.
The exercise consisted of reformatting the dataset for fine-tuning Llama 2, collapsing it into a single column (**text**) that carries the full prompt format.
The code structure used for this purpose can be found in the following repository: https://github.com/MuhammadMoinFaisal/LargeLanguageModelsProjects/tree/main
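The collapsing step described above can be sketched as a simple mapping function. The field names (`instruction`, `response`) and the exact chat template are assumptions for illustration, not the authors' actual code:

```python
# Hypothetical sketch: merge prompt/response fields into the single
# "text" column expected for Llama 2 fine-tuning. The source column
# names and the [INST] template below are assumed, not taken from
# the original preprocessing code.
def to_llama2_text(example: dict) -> dict:
    return {
        "text": f"<s>[INST] {example['instruction']} [/INST] {example['response']} </s>"
    }

row = {"instruction": "Summarize the article.", "response": "It is short."}
print(to_llama2_text(row)["text"])
```

With the `datasets` library, such a function would typically be applied via `dataset.map(to_llama2_text, remove_columns=...)` so only the `text` column remains.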
deep-plants/AGM | ---
license: cc
size_categories:
- 100K<n<1M
task_categories:
- image-classification
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 3208126820.734
num_examples: 972858
download_size: 3245813213
dataset_size: 3208126820.734
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for AGM Dataset
## Dataset Summary
The AGM (AGricolaModerna) Dataset is a comprehensive collection of high-resolution RGB images capturing harvest-ready plants in a vertical farm setting. This dataset consists of 972,858 images, each with a resolution of 120x120 pixels, covering 18 different plant crops. In the context of this dataset, a crop refers to a plant species or a mix of plant species.
## Supported Tasks
Image classification: plant phenotyping
## Languages
The dataset consists of image data and does not involve language content; the accompanying labels and metadata are in English.
## Dataset Structure
### Data Instances
A typical data instance from the training set consists of the following:
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=120x120 at 0x29CEAD71780>,
'crop_type': 'by'
}
```
### Data Fields
The dataset's data instances have the following fields:
- `image`: A PIL.Image.Image object representing the image.
- `crop_type`: A string identifier for the crop type shown in the image.
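Given instances shaped like the example above, the distribution of crop types can be tallied in a few lines. The sample records below are illustrative stand-ins (only the `'by'` code appears in the card; the other crop code is invented), not actual dataset contents:

```python
from collections import Counter

# Illustrative only: a few records shaped like AGM instances.
# The real split holds 972,858 image/crop_type pairs; 'xx' is a
# made-up crop code used here purely for demonstration.
sample = [
    {"crop_type": "by"},
    {"crop_type": "by"},
    {"crop_type": "xx"},
]

counts = Counter(record["crop_type"] for record in sample)
print(counts.most_common())
```

On the real dataset the same `Counter` pass over the `crop_type` column would reveal how balanced the 18 crop categories are.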
### Data Splits
- **Training Set**:
- Number of Examples: 972,858
## Dataset Creation
### Curation Rationale
The creation of the AGM Dataset was motivated by the need for a large and diverse dataset that captures various aspects of modern agriculture, including plant species diversity, stress detection, and crop health assessment.
### Source Data
#### Initial Data Collection and Normalization
The images were captured using a high-resolution camera positioned above a moving table in an agricultural setting. The camera captured images of the entire table, which was filled with trays of harvested crops. The image capture process spanned from May 2022 to December 2022. The original images had a resolution of 1073×650 pixels. Each pixel in the images corresponds to a physical size of 0.5 millimeters.
### Annotations
#### Annotation Process
Agronomists and domain experts were involved in the annotation process. They annotated each image to identify the crops present and assign them to specific categories or species. This annotation process involved labeling each image with one of 18 distinct crop categories, which include individual plant species and mixtures of species.
### Who Are the Annotators?
The annotators are agronomists employed by Agricola Moderna.
## Personal and Sensitive Information
The dataset does not contain personal or sensitive information about individuals. It primarily consists of images of plants.
## Considerations for Using the Data
### Social Impact of Dataset
The AGM Dataset has potential social impact in modern agriculture and related domains. It can advance agriculture by aiding the development of innovative technologies for crop monitoring, disease detection, and yield prediction, fostering sustainable farming practices, contributing to food security and ensuring higher agricultural productivity and affordability. The dataset supports research for environmentally sustainable agriculture, optimizing resource use and reducing environmental impact.
### Discussion of Biases and Known Limitations
The dataset primarily involves images from a single vertical-farm setting and therefore, while massive, includes relatively little variation in crop types. The dataset's contents and annotations may reflect regional agricultural practices and preferences. Business preferences also play a substantial role in determining the types of crops grown in vertical farms. These preferences, often influenced by market demand and profitability, can significantly differ from conventional open-air field agriculture. Therefore, the dataset may inherently reflect these business-driven crop choices, potentially affecting its representativeness of broader agricultural scenarios.
## Additional Information
### Dataset Curators
The dataset is curated by DeepPlants and AgricolaModerna. You can contact us for further information at:
nico@deepplants.com
etienne.david@agricolamoderna.com
### Licensing Information
### Citation Information
If you use the AGM dataset in your work, please consider citing the following publication:
```bibtex
@InProceedings{Sama_2023_ICCV,
author = {Sama, Nico and David, Etienne and Rossetti, Simone and Antona, Alessandro and Franchetti, Benjamin and Pirri, Fiora},
title = {A new Large Dataset and a Transfer Learning Methodology for Plant Phenotyping in Vertical Farms},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {October},
year = {2023},
pages = {540-551}
}
``` |
result-kand2-sdxl-wuerst-karlo/39ceeb6b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 173
num_examples: 10
download_size: 1318
dataset_size: 173
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "39ceeb6b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KushT/trec_question_classification_split | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 307501
num_examples: 4906
- name: validation
num_bytes: 33969
num_examples: 546
- name: test
num_bytes: 23979
num_examples: 500
download_size: 220019
dataset_size: 365449
---
|
JinglesDados/MarciaGomes | ---
license: openrail
---
|