datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Meohong/Judgement_dataset | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_cola_present_for_exp_perfect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1586
num_examples: 16
- name: test
num_bytes: 2468
num_examples: 30
- name: train
num_bytes: 17835
num_examples: 253
download_size: 16314
dataset_size: 21889
---
# Dataset Card for "MULTI_VALUE_cola_present_for_exp_perfect"
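The YAML header above is self-consistent: `dataset_size` equals the sum of the per-split `num_bytes`. A quick sanity check in plain Python (the numbers are copied from the header; the variable names are illustrative):

```python
# Per-split metadata copied from the dataset_info header above.
splits = {
    "dev":   {"num_bytes": 1586,  "num_examples": 16},
    "test":  {"num_bytes": 2468,  "num_examples": 30},
    "train": {"num_bytes": 17835, "num_examples": 253},
}

total_bytes = sum(s["num_bytes"] for s in splits.values())
total_examples = sum(s["num_examples"] for s in splits.values())

print(total_bytes)     # 21889, matching dataset_size in the YAML header
print(total_examples)  # 299
```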
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/3D_Face_Recognition_Images_Data | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/3D_Face_Recognition_Images_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1093?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
5,199 People - 3D Face Recognition Images Data. The collection scene is an indoor scene. The dataset includes both males and females. The age distribution ranges from juveniles to the elderly, with young and middle-aged people forming the majority. The collection devices include the iPhone X and iPhone XR. The data diversity covers multiple facial postures, multiple light conditions, and multiple indoor scenes. This data can be used for tasks such as 3D face recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1093?source=Huggingface
### Supported Tasks and Leaderboards
face-detection, computer-vision: The dataset can be used to train a model for face detection.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
AlvianKhairi/my-pandas-dataset-Abstract_No_Link_25k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 30020414
num_examples: 25000
download_size: 14958534
dataset_size: 30020414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-pandas-dataset-Abstract_No_Link_25k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Katsurades/LoRa | ---
license: other
---
|
liuyanchen1015/MULTI_VALUE_cola_drop_copula_be_AP | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3332
num_examples: 42
- name: test
num_bytes: 3660
num_examples: 48
- name: train
num_bytes: 25894
num_examples: 378
download_size: 20817
dataset_size: 32886
---
# Dataset Card for "MULTI_VALUE_cola_drop_copula_be_AP"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_it_is_referential | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 30264
num_examples: 167
- name: test
num_bytes: 389888
num_examples: 1969
- name: train
num_bytes: 326137
num_examples: 1632
download_size: 462937
dataset_size: 746289
---
# Dataset Card for "MULTI_VALUE_qqp_it_is_referential"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0xk1h0/Py150k-vuln-scanned | ---
license: mit
---
|
msklar/skribbl-drawings | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1263264.0
num_examples: 304
download_size: 1043652
dataset_size: 1263264.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fangyuan/longform_sciqa | ---
license: cc
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for Long-form-sci-qa
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:** [🥝KIWI: A Dataset of Knowledge-Intensive Writing Instructions for Answering Research Questions](https://www.cs.utexas.edu/~fxu/kiwi/kiwi_paper.pdf)
- **Point of Contact:** fangyuan[at]utexas.edu
### Dataset Summary
This dataset contains the question and document pairs annotated for 🥝KIWI.
### Languages
The dataset contains data in English.
## Dataset Structure
### Data Instances
Each instance is a question, paired with the related work paragraph for which the question is written and a set of relevant papers (cited in the related work paragraph).
### Data Fields
Each instance contains the following fields:
* `question`: the input question *q*
* `related_work_paragraph`: the related work paragraph that the annotator wrote the question for.
* `cited_papers`: The list of papers that are relevant to the question.
* `cited_paragraphs`: The list of extracted paragraphs from the cited papers.
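As a rough illustration of the structure described above, a single instance might look like the following (the values are hypothetical and purely illustrative; only the field names come from the list above):

```python
# Hypothetical instance sketch; field names follow the dataset card,
# values are made up for illustration.
instance = {
    "question": "How do models handle long-form scientific questions?",
    "related_work_paragraph": "Prior work on long-form QA has ...",
    "cited_papers": ["Paper A", "Paper B"],
    "cited_paragraphs": [
        ["A paragraph extracted from Paper A"],
        ["A paragraph extracted from Paper B"],
    ],
}

print(sorted(instance))  # the four fields described above
```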
## Dataset Creation
Please refer to our [paper](https://arxiv.org/pdf/2403.03866.pdf) (Section 3.1) for details on annotation process and discussion on limitations.
## Additional Information
Please check out [this dataset](https://huggingface.co/datasets/fangyuan/kiwi) for the collected interaction data.
### Licensing Information
https://creativecommons.org/licenses/by-sa/4.0/legalcode
### Citation Information
```
@article{xu2024kiwi,
title = {KIWI: A Dataset of Knowledge-Intensive Writing Instructions for Answering Research Questions},
author = {Xu, Fangyuan and Lo, Kyle and Kuehl, Bailey and Soldaini, Luca and Choi, Eunsol and Wadden, David},
year = 2024,
}
``` |
bagu/topik2 | ---
license: llama2
---
|
lamini/alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 27364517
num_examples: 52002
download_size: 12742513
dataset_size: 27364517
---
# Dataset Card for "alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dvilasuero/multiturner-for-generation | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
- name: input
dtype: string
- name: generation_model
dtype: string
- name: generation_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: followup
sequence: string
splits:
- name: train
num_bytes: 34132394
num_examples: 3431
download_size: 17508262
dataset_size: 34132394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "multiturner-for-generation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_39 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1309958396
num_examples: 255253
download_size: 1331632913
dataset_size: 1309958396
---
# Dataset Card for "chunk_39"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/dcase2016_task2_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 829448008.0
num_examples: 72
- name: academicodec_hifi_16k_320d
num_bytes: 276485559.0
num_examples: 72
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 276485559.0
num_examples: 72
- name: academicodec_hifi_24k_320d
num_bytes: 414725559.0
num_examples: 72
- name: audiodec_24k_320d
num_bytes: 414725559.0
num_examples: 72
- name: dac_16k
num_bytes: 276485559.0
num_examples: 72
- name: dac_24k
num_bytes: 414725559.0
num_examples: 72
- name: dac_44k
num_bytes: 762053559.0
num_examples: 72
- name: encodec_24k
num_bytes: 414725703.0
num_examples: 72
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 276485703.0
num_examples: 72
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 276485703.0
num_examples: 72
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 276485703.0
num_examples: 72
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 276485703.0
num_examples: 72
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 276485703.0
num_examples: 72
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 276485703.0
num_examples: 72
- name: speech_tokenizer_16k
num_bytes: 276531639.0
num_examples: 72
download_size: 6009102140
dataset_size: 6015306481.0
---
# Dataset Card for "dcase2016_task2_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
muhtasham/autonlp-data-Doctor_DE | ---
language:
- de
task_categories:
- text-classification
task_ids:
- text-scoring
---
# AutoNLP Dataset for project: Doctor_DE
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project Doctor_DE.
### Languages
The BCP-47 code for the dataset's language is de.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Ich bin nun seit ca 12 Jahren Patientin in dieser Praxis und kann einige der Kommentare hier ehrlich gesagt \u00fcberhaupt nicht nachvollziehen.<br />\nFr. Dr. Gr\u00f6ber Pohl ist in meinen Augen eine unglaublich nette und kompetente \u00c4rztin. Ich kenne in meinem Familien- und Bekanntenkreis viele die bei ihr in Behandlung sind, und alle sind sehr zufrieden!<br />\nSie nimmt sich immer viel Zeit und auch in meiner Schwangerschaft habe ich mich bei ihr immer gut versorgt gef\u00fchlt, und musste daf\u00fcr kein einziges Mal in die Tasche greifen!<br />\nDas einzig negative ist die lange Wartezeit in der Praxis. Daf\u00fcr nimmt sie sich aber auch Zeit und arbeitet nicht wie andere \u00c4rzte wie am Flie\u00dfband.<br />\nIch kann sie nur weiter empfehlen!",
"target": 1.0
},
{
"text": "Ich hatte nie den Eindruck \"Der N\u00e4chste bitte\" Er hatte sofort meine Beschwerden erkannt und Abhilfe geschafft.",
"target": 1.0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='float32', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 280191 |
| valid | 70050 |
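The table above implies roughly an 80/20 train/validation split; a quick check in plain Python (numbers copied from the table):

```python
# Split sizes from the table above.
train, valid = 280_191, 70_050
total = train + valid

print(total)                    # 350241 samples overall
print(round(train / total, 2))  # 0.8, i.e. an ~80/20 split
```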
|
michaelnath/annotated-code-functions-teensy | ---
dataset_info:
features:
- name: function
dtype: string
- name: repo_name
dtype: string
- name: features
sequence: float64
splits:
- name: train
num_bytes: 454721
num_examples: 1001
download_size: 152815
dataset_size: 454721
---
# Dataset Card for "annotated-code-functions-teensy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
biglam/early_printed_books_with_multiple_font_groups | Invalid username or password. |
laion/OIG-riverbed-filtered-small | ---
license: apache-2.0
---
## OIG-riverbed-filtered-small
A small filtered version of https://huggingface.co/datasets/laion/OIG, used for experimenting with filtering, clustering, and visualizing the data in the OIG dataset.
```
'unatural_instructions': 33110,
'ul2_plus_oscar_en_00300': 32439,
'infil_dbpedia': 81913,
'synth_qa': 11056,
'rallio': 192978,
'unifiedskg': 34692,
'soda_dialog': 40233,
'merged_code_xp3': 50000,
'laion_image_prompts': 10572,
'oscar_en_00000': 27713,
'prosocial': 14543,
'xp3_sample': 38833,
'mathqa': 10674,
'anthrop_helpful': 4816,
'flanv2_cot_qed_train': 2370,
'dahoas': 27534,
'flanv2_cot_esnli_train': 11325,
'flanv2_cot_creak_train': 1809,
'conala': 2669,
'synth_code': 3206,
'flanv2_cot_gsm8k_train': 3356,
'kojma_cot': 3135,
'essays': 1531,
'plot_screenplay_books': 8135,
'safety_image_prompt': 5001,
'synth_depression': 944,
'cuad': 497,
'flanv2_cot_sensemaking_train': 1589,
'anthrop_redteam': 1142,
'flanv2_cot_strategyqa_train': 357,
'flanv2_cot_ecqa_train': 2862,
'flanv2_cot_aqua_train': 1157,
'flanv2_cot_qasc_train': 516,
'wiki_toxic_nontoxic': 103
```
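Summing the per-source counts above gives the total number of examples in this filtered subset; a quick sanity check in plain Python (counts copied verbatim from the list above):

```python
# Per-source example counts copied from the list above.
source_counts = {
    "unatural_instructions": 33110, "ul2_plus_oscar_en_00300": 32439,
    "infil_dbpedia": 81913, "synth_qa": 11056, "rallio": 192978,
    "unifiedskg": 34692, "soda_dialog": 40233, "merged_code_xp3": 50000,
    "laion_image_prompts": 10572, "oscar_en_00000": 27713,
    "prosocial": 14543, "xp3_sample": 38833, "mathqa": 10674,
    "anthrop_helpful": 4816, "flanv2_cot_qed_train": 2370,
    "dahoas": 27534, "flanv2_cot_esnli_train": 11325,
    "flanv2_cot_creak_train": 1809, "conala": 2669, "synth_code": 3206,
    "flanv2_cot_gsm8k_train": 3356, "kojma_cot": 3135, "essays": 1531,
    "plot_screenplay_books": 8135, "safety_image_prompt": 5001,
    "synth_depression": 944, "cuad": 497,
    "flanv2_cot_sensemaking_train": 1589, "anthrop_redteam": 1142,
    "flanv2_cot_strategyqa_train": 357, "flanv2_cot_ecqa_train": 2862,
    "flanv2_cot_aqua_train": 1157, "flanv2_cot_qasc_train": 516,
    "wiki_toxic_nontoxic": 103,
}

total = sum(source_counts.values())
print(total)  # 662810 examples across 34 sources
```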
It is best to download the data directly instead of using the HF `load_dataset` function: https://huggingface.co/datasets/laion/OIG-riverbed-filtered-small/resolve/main/OIG_filtered.jsonl
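Once downloaded, the file can be parsed as JSON Lines (one JSON object per line). A minimal sketch: in practice you would fetch the URL above (e.g. with `urllib.request.urlopen`) and stream its lines; here an inline sample stands in for the file, and the `text`/`metadata` field names are assumptions based on the usual OIG layout, not guaranteed by this dataset:

```python
import json

def iter_jsonl(lines):
    """Yield one parsed record per non-empty JSON line."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

# Inline stand-in for the downloaded OIG_filtered.jsonl file;
# field names here are illustrative assumptions.
sample = [
    '{"text": "<human>: hello <bot>: hi", "metadata": {"source": "synth_qa"}}',
    '{"text": "<human>: 2+2? <bot>: 4", "metadata": {"source": "mathqa"}}',
]

records = list(iter_jsonl(sample))
print(len(records))  # 2
```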
Topic map for a subset of the data:
```
.
├─Military Documents and Sentences____
│ ├─Burlington County Images and Employment Chart____
│ │ ├─Reports and assessments on risk analysis, data management, and industrial hygiene standards for tran
│ │ │ ├─NCHRP Reports for Transportation Management and Planning____
│ │ │ │ ├─■──Guidelines for Design of Corrosion-Damaged Bridge Superstructure with nchrp report____ ── Topic: 70
│ │ │ │ └─■──Transportation and Risk Management Practices____ ── Topic: 16
│ │ │ └─Assessment and Review of Chemical Agent Destruction Pilot Plants for Waste Disposal____
│ │ │   ├─■──Space Science and Technology Documents____ ── Topic: 18
│ │ │   └─Waste disposal, chemical agent destruction pilot plants review and assessment____
│ │ │     ├─■──Chemical Agent Destruction Pilot Plant Review and Assessment____ ── Topic: 48
│ │ │     └─■──Radioactive Waste Disposal and Alternative Treatments: Review of Various Documents____ ── Topic: 45
│ │ └─Highlighted Townships in Burlington County, NJ with Inset Maps____
│ │   ├─Township Highlighted in Burlington County - Image Prompts____
│ │   │ ├─Township Highlighting in Burlington County, New Jersey____
│ │   │ │ ├─■──Images and landmarks of Jonesboro, Waycross, and their respective counties in Arkansas and Georgia__ ── Topic: 97
│ │   │ │ └─■──Highlighted townships in Burlington County, New Jersey____ ── Topic: 26
│ │   │ └─Images and Workshop Summary Documents Related to Gulf Research Program, Mississippi River Basin Wate
│ │   │   ├─Geography and Environmental Issues in North and South America____
│ │   │   │ ├─■──Arctic Sea Ice Extent and Stability Analysis____ ── Topic: 96
│ │   │   │ └─Document topics related to rivers, maps, disasters, and ecosystem resilience in southern and eastern
│ │   │   │   ├─Geographic features in North and South America including rivers, watersheds, and ecosystems____
│ │   │   │   │ ├─■──Image Prompts for Cartographic Maps of Various Locations including South America, New Zealand, and t ── Topic: 19
│ │   │   │   │ └─■──Rivers and Ecosystems: Research and Monitoring in the Gulf and Oregon Watersheds____ ── Topic: 15
│ │   │   │   └─■──Disaster Resilience and Preparedness: Perspectives from Workshops and Case Studies____ ── Topic: 41
│ │   │   └─Geologic Survey and Cartographic Image Prompt with State Summary Information and District Boundaries
│ │   │     ├─■──Geological surveys and information on mineral resources in state parks____ ── Topic: 93
│ │   │     └─■──Drawings of geological features and caves including limestone, Coronado, and Pirinexus challenges___ ── Topic: 49
│ │   └─Graphical analysis of energy-related costs and trends over time____
│ │     ├─Graphs showing changes in U.S. employment, gasoline prices, energy expenditures, and employment cost
│ │     │ ├─■──Climate assessment and biogeochemical cycles with precipitation trends and image prompts.____ ── Topic: 56
│ │     │ └─Graphs showing changes in employment, natural gas electric generating capacity, and residential elec
│ │     │   ├─■──Energy Trends and Prices____ ── Topic: 6
│ │     │   └─■──Employment Trends in Selected Metropolitan Areas____ ── Topic: 3
│ │     └─■──Occupational injuries and illnesses rates and incidents in selected state and local government indus ── Topic: 63
│ └─Military operations and command updates in 2019____
│   ├─Health-related Workshop Proceedings and Image Prompts for Cancer, HIV, Health Literacy, and Accounti
│   │ ├─■──Promoting mental and behavioral health through effective therapy and preventive strategies____ ── Topic: 84
│   │ └─Health and healthcare approaches for various illnesses and conditions, including COVID-19, HIV, and
│   │   ├─Health care and COVID-19: Cancer treatment, innovation, and accounting approaches in low-resource ar
│   │   │ ├─■──Images related to Covid-19 vaccination and prevention____ ── Topic: 10
│   │   │ └─■──Health Policy Workshop Proceedings and Image Covers for Cancer, Workforce, Literacy, and Accounting ── Topic: 4
│   │   └─■──Annual Medical College Announcements in Philadelphia and Pennsylvania____ ── Topic: 67
│   └─Air Force documents covering various topics such as commanders, training, veterans, and change of co
│     ├─Air Force Operations and Training Documents____
│     │ ├─■──Documents related to military training and operations of Marine Corps and Army forces in 2013, 2017, ── Topic: 2
│     │ └─■──Images of Air Force Change of Command Ceremonies____ ── Topic: 1
│     └─Images and news related to the United States Defense Secretary James Mattis____
│       ├─Images of Defense Secretary James Mattis in Meetings and Testimonies.____
│       │ ├─■──Images related to Russian and Kazakh politicians including Solzhenitsyn, Yanukovych, Navalny, Aliyev ── Topic: 88
│       │ └─■──Images featuring Defense Secretary James Mattis in official meetings and events.____ ── Topic: 7
│       └─Police and Law Enforcement Activities and Incidents in Various Canadian Cities____
│         ├─■──Protests and Demonstrations with Activists and Slogans____ ── Topic: 32
│         └─■──Police activities in Windsor and Victoria, including commendations, dedicated flag, and crime scene ── Topic: 22
└─Clipart use for teaching materials with unlimited access for Abcteach members____
  ├─Printable worksheets for kindergarten learning and coloring pages with image prompts and sensory act
  │ ├─Sports Image Prompts featuring Football, Cricket, Hockey and American Football Players____
  │ │ ├─Various Image Prompts for Drawings Related to Football and Valparaiso Football Season Disappointment
  │ │ │ ├─■──Valparaiso Crusaders Football Season____ ── Topic: 106
  │ │ │ └─Sports-themed image prompts and podcast related to football and basketball.____
  │ │ │   ├─■──Sports images with football players and throwback Riddell helmets____ ── Topic: 13
  │ │ │   └─■──Basketball and Kardashian-West sightings____ ── Topic: 23
  │ │ └─Indian Cricket - Matches, Images, and Cheer____
  │ │   ├─Indian Premier League and Cricket Matches____
  │ │   │ ├─■──Soccer Matches and Teams in Various Leagues____ ── Topic: 12
  │ │   │ └─■──Cricket matches and fans in India, featuring IPL teams Kings XI Punjab and Kolkata Knight Riders, Bo ── Topic: 9
  │ │   └─Olympic awards and championships won by athletes in various sports and attended by celebrities, with
  │ │     ├─Olympics-related Image and Sentence Prompts____
  │ │     │ ├─■──Images of Ricky Gervais, Jennifer Aniston, and Rachel Brosnahan at various award shows in Beverly Hi ── Topic: 8
  │ │     │ └─■──Olympic Medals and Champions in Tennis____ ── Topic: 21
  │ │     └─■──Mixed Combat Sports Culture with Boxing and Wrestling Championships____ ── Topic: 64
  │ └─Collection of Printable Worksheets for Kindergarten and Grade Levels____
  │   ├─Designing school hooded sweatshirts with super-soft cotton/poly fleece____
  │   │ ├─Collection of Robert Dennis Stereoscopic Views - Image Drawing Prompts____
  │   │ │ ├─Images of Greek and Byzantine Empires under different dynasties in ancient and medieval times, inclu
  │   │ │ │ ├─Images of the Byzantine Empire under various dynasties and territories in ancient times____
  │   │ │ │ │ ├─■──Cultural highlights of Asia - temples, burial complexes, and historic figures____ ── Topic: 78
  │   │ │ │ │ └─Byzantine Empire under various dynasties and territories in medieval times____
  │   │ │ │ │   ├─■──Art and History Document Prompts____ ── Topic: 47
  │   │ │ │ │   └─■──Byzantine Empire and its Dynasties with Territory and Depicted Borders____ ── Topic: 90
  │   │ │ │ └─■──Greek language and culture in modern and ancient Greece with various areas, religions, and historica ── Topic: 89
  │   │ │ └─Robert Dennis Stereoscopic Views image prompts and stability prompt____
  │   │ │   ├─Collection of Stereoscopic Views by Robert Dennis for Image Prompts____
  │   │ │   │ ├─Lamborghini and Ducati Racing Footage and Newsreels in Different Battle Zones and Civil Wars____
  │   │ │   │ │ ├─Lamborghini and Motorsports - Super Trofeo Races and Co-Branded Collections____
  │   │ │   │ │ │ ├─■──Motorsports and Racing with Schumacher, Capps, and more____ ── Topic: 74
  │   │ │   │ │ │ └─■──Lamborghini Super Trofeo and Automobili Celebrations____ ── Topic: 43
  │   │ │   │ │ └─Images prompts for various topics including Civil War, New York Stock Exchange, and Star Wars: Dange
  │   │ │   │ │   ├─Historical events and footage related to battles, wars, and stock exchange in different countries an
  │   │ │   │ │   │ ├─■──Various Battles and War-related Topics____ ── Topic: 27
  │   │ │   │ │   │ └─■──Stock newsreel videos of historical events in New York City____ ── Topic: 68
  │   │ │   │ │   └─Korean dramas with various casts and release dates, including "Window," "One Ordinary Day," "NCT 24h
  │   │ │   │ │     ├─■──Image prompts for various topics including Pickett N901-ES simplex slide rule, October issue of Peac ── Topic: 108
  │   │ │   │ │     └─■──Korean dramas and their cast and release dates, featuring "Racket Boys", "One Ordinary Day", "NCT 24 ── Topic: 104
  │   │ │   │ └─Stereoscopic views by Robert Dennis collection - Image prompts for drawing____
  │   │ │   │   ├─Retirement Banquet at Centennial Student Union, Mankato State University____
  │   │ │   │   │ ├─Religion and Faith Diversity in Infographics, Icons, and Images.____
  │   │ │   │   │ │ ├─■──Religious Icons and Symbols____ ── Topic: 38
  │   │ │   │   │ │ └─■──Images and Infographics Related to Muslims and Islamophobia____ ── Topic: 61
  │   │ │   │   │ └─Retirement Banquet at Mankato State University in June with Awards and Speakers____
  │   │ │   │   │   ├─Retirement banquet at Centennial Student Union, Mankato State University with awards and speakers.__
  │   │ │   │   │   │ ├─Images of African Union, Women's History Month, and Indigenous Tribes____
  │   │ │   │   │   │ │ ├─■──Celebrating Women's History Month and Women's Rights Activists____ ── Topic: 98
  │   │ │   │   │   │ │ └─Indigenous and African Union Partnerships in Tradition and Development____
  │   │ │   │   │   │ │   ├─■──Cultural Traditions and Sovereignty of Indigenous Peoples____ ── Topic: 29
  │   │ │   │   │   │ │   └─■──African Union Partnership for COVID-19 Media Outreach and Prevention____ ── Topic: 30
  │   │ │   │   │   │ └─Retirement Banquet at Centennial Student Union, Mankato State University____
  │   │ │   │   │   │   ├─■──Retirement Banquet Image Prompts at Mankato State University's Centennial Student Union, June____ ── Topic: 35
  │   │ │   │   │   │   └─■──Queensland State Archives images of Brisbane and surrounding areas in 1930s____ ── Topic: 100
  │   │ │   │   │   └─Drawings of Skyscrapers and Landmarks in Indianapolis and Tampa____
  │   │ │   │   │     ├─■──Images and documents related to Smithsonian, Whitney Museum of American Art, and Panama-Pacific Inte ── Topic: 58
  │   │ │   │   │     └─■──City Skylines - Indianapolis and Tampa____ ── Topic: 87
  │   │ │   │   └─Image prompts from Robert N. Dennis Collection of Stereoscopic Views for drawing different sceneries
  │   │ │   │     ├─Image prompts from Robert N. Dennis collection of stereoscopic views____
  │   │ │   │     │ ├─Images of animals and their interactions with humans in various settings____
  │   │ │   │     │ │ ├─■──Wildlife Encounters and Human Interactions in National Parks____ ── Topic: 110
  │   │ │   │     │ │ └─■──Images of dogs and their owners in various settings and activities (e.g. training, running marathons ── Topic: 91
  │   │ │   │     │ └─Image prompts from Robert Dennis Collection of Stereoscopic Views____
  │   │ │   │     │   ├─Garden Image Prompts - Wellington, Marengo, Sutton Place, Unidentified____
  │   │ │   │     │   │ ├─■──Images of Gardens and Outdoor Decor____ ── Topic: 14
  │   │ │   │     │   │ └─■──Wellington Real Estate Auctions and Subdivisions (with Cartographic Material)____ ── Topic: 46
  │   │ │   │     │   └─Robert Dennis Stereoscopic Views Collection Images of Cities and Landscapes____
  │   │ │   │     │     ├─■──Stereoscopic Views from Robert Dennis Collection of Various Cities____ ── Topic: 17
  │   │ │   │     │     └─■──Various Cathedrals and Landmarks in Different Locations____ ── Topic: 73
  │   │ │   │     └─Royal Family Events and Visits____
  │   │ │   │       ├─■──Royal Events and Visits including Duchess of Cornwall, Duke and Duchess of Cambridge, Prince Harry, ── Topic: 40
  │   │ │   │       └─■──Funerals and Coffins of Victims in Ireland____ ── Topic: 113
  │   │ │   └─Memes, Invention, Archery, Politics, and Corrupt Politicians____
  │   │ │     ├─■──Memes, aliens, politics, and corruption in Minnesota____ ── Topic: 116
  │   │ │     └─■──Drawing memes related to inventions, archery, and humor.____ ── Topic: 39
  │   │ └─Invoice Design and Template Assistance for Pepperdine University Community____
  │   │   ├─Invoice and template design for Pepperdine University with remarkable dashboard and software feature
  │   │   │ ├─RTCA 2013 Brochure Provides Information on Current Telecom Projects and Successes____
  │   │   │ │ ├─VC Funds and Industry Analysis for Funding Rounds in Asia and Location Based Services during Recent
  │   │   │ │ │ ├─■──VC Funds and Funding Rounds in Various Industries____ ── Topic: 71
  │   │   │ │ │ └─■──Drawings of blockchain and cryptocurrency-related events and companies, including Rio DeFi, Healthur ── Topic: 51
  │   │   │ │ └─RTCA 2013 Brochure Provides Information on Current Telecommunications Projects and Recent Successes_
  │   │   │ │   ├─Technology Innovation and Communications Engineering____
  │   │   │ │   │ ├─■──Innovation, Patents, Technology, Leadership, and Impact Trends____ ── Topic: 75
  │   │   │ │   │ └─■──Technology and Engineering Strategies for Outsourcing Telecommunications and Business Analytics____ ── Topic: 28
  │   │   │ │   └─■──rtca 2013 brochure provides current news and successes of projects____ ── Topic: 52
  │   │   │ └─Invoicing and Dashboard Templates for Pepperdine University and Business Use____
  │   │   │   ├─Wiring Diagrams for Home and Office____
  │   │   │   │ ├─Wiring Diagrams for Home and Office____
  │   │   │   │ │ ├─Wiring Diagrams for Manufacturing and Repair____
  │   │   │   │ │ │ ├─Small Machine Shop Owners' Reactions to Automated Manufacturing Research Facility____
  │   │   │   │ │ │ │ ├─■──Image prompts for small machine shop owners' reactions to automated manufacturing and CNC machine to ── Topic: 101
  │   │   │   │ │ │ │ └─■──Ford maintenance and repair manuals for various vehicle models including Bronco, Festiva, Ranger, an ── Topic: 53
  │   │   │   │ │ │ └─Electrical Wiring Diagrams for Various Applications____
  │   │   │   │ │ │   ├─■──Electrical Wiring Diagrams for Various Applications____ ── Topic: 25
  │   │   │   │ │ │   └─■──Various Images of Trains and Railways____ ── Topic: 69
  │   │   │   │ │ └─Furniture and Dining Table Ideas____
  │   │   │   │ │   ├─■──Various Lighting Ideas and Products for Different Settings and Purposes____ ── Topic: 65
  │   │   │   │ │   └─■──Drawing furniture and dining table sets with linen fabric and mahogany finish from image prompts.___ ── Topic: 31
  │   │   │   │ └─Roofing Companies and Services____
  │   │   │   │   ├─■──Product Recalls and Safety Hazards____ ── Topic: 99
  │   │   │   │   └─Roofing Companies and Services in Various Locations____
  │   │   │   │     ├─■──Wildland Firefighters and Fire Management Research____ ── Topic: 94
  │   │   │   │     └─■──Roofing Contractors and Services in Various Locations with Additional Related Keywords.____ ── Topic: 77
  │   │   │   └─Invoice Design and Approval at Pepperdine University's Community Platform.____
  │   │   │     ├─Invoice design template and approval process at Pepperdine University's communitypepperdineedu____
  │   │   │     │ ├─Pepperdine University invoice design and approval process____
  │   │   │     │ │ ├─Invoice Template and Dashboard for Pepperdine University and Business Invoicing with Free Downloads
  │   │   │     │ │ │ ├─■──Pepperdine University Invoice Templates____ ── Topic: 44
  │   │   │     │ │ │ └─■──Image prompts for drawing related to business, contracts, and agreements____ ── Topic: 92
  │   │   │     │ │ └─Logo design entries in a contest for various businesses____
  │   │   │     │ │   ├─Logo Design Contest Entries____
  │   │   │     │ │   │ ├─Dell EMC IT Certifications and Technologies____
  │   │   │     │ │   │ │ ├─Illustration, Mendelian Genetics, and Irwaddy Dolphin's Skeleton in Museo di Storia Naturale____
  │   │   │     │ │   │ │ │ ├─■──Genetic mechanisms and structure of prokaryotic and eukaryotic cells____ ── Topic: 42
  │   │   │     │ │   │ │ │ └─■──Images of natural history specimens exhibited in museums with keywords including irrawaddy dolphin, ── Topic: 111
  │   │   │     │ │   │ │ └─■──Dell EMC certification exam image prompts for networking, cloud infrastructure and services, and pow ── Topic: 54
  │   │   │     │ │   │ └─Logo Design Contest Entries for Various Businesses____
  │   │   │     │ │   │   ├─■──Logo Design Contest Entries____ ── Topic: 36
  │   │   │     │ │   │   └─■──Legal Documents Management for Law Firms and Corporate Legal Departments____ ── Topic: 86
  │   │   │     │ │   └─English subtitles download for various movies and shows____
  │   │   │     │ │     ├─Cannabis and Biome Grow Companies, Health Effects and Business Plan for Investors, and Doc Hollidaze
  │   │   │     │ │     │ ├─■──Cannabis Business and Cultivation featuring Highland Grow and Doc Hollidaze Premium Cannabis____ ── Topic: 107
  │   │   │     │ │     │ └─■──Board Results and Examinations across India (CBSE, ICSE, Punjab, Rajasthan, Tripura) including Suppl ── Topic: 117
  │   │   │     │ │     └─■──Subtitles Download for English Movies____ ── Topic: 105
  │   │   │     │ └─News and Reading Platforms for Sangrur and Barnala in Punjabi Jagran in 2014 for iPad, iPhone, and S
  │   │   │     │   ├─■──Smartphone brands and features - iPhone, Samsung Galaxy, Nokia launches and latest updates.____ ── Topic: 34
  │   │   │     │   └─■──News and e-paper articles in Punjabi and Hindi for Sangrur and Barnala on tablets and smartphones in ── Topic: 85
  │   │   │     └─Cyberbullying Prevention and Tactics____
  │   │   │       ├─Image prompts for drawing based on podcast episodes with plugins for managing work and product grids
  │   │   │       │ ├─■──Drawing image prompts for podcast episodes with various plugin features____ ── Topic: 79
โ โ โ โ โโโ โโVarious topics related to Zuckerberg, Facebook, and related events____ โโ Topic: 115
โ โ โ โโCyberbullying Prevention Tactics with iPredator and Michael Nuccitelli____
โ โ โ โโInvestment in Gold and Silver Commodities on Comex by Managed Money with Speculative Positioning, ba
โ โ โ โ โโโ โโComex Managed Money Speculative Positions on Gold and Silver Futures and Options____ โโ Topic: 95
โ โ โ โ โโโ โโVarious Diamond Jewelry Images and Prompts____ โโ Topic: 82
โ โ โ โโCyberbullying Prevention Tactics____
โ โ โ โโโ โโScams and Romance - Protecting Yourself Online and Offline____ โโ Topic: 102
โ โ โ โโโ โโCyberbullying Prevention and Tactics through iPredator and Michael Nuccitelli's Work____ โโ Topic: 83
โ โ โโSchool Hooded Sweatshirts in Super-Soft Cotton/Poly Fleece____
โ โ โโEssays on Christmas, Macbeth, Romeo and Juliet with image prompts____
โ โ โ โโEnglish literature analysis and critical essays on Macbeth, Romeo and Juliet, and The Catcher in the
โ โ โ โ โโโ โโPoetry and Essays by Various Famous Poets____ โโ Topic: 76
โ โ โ โ โโโ โโEssays on Shakespeare's Macbeth and Romeo and Juliet, as well as other literary works____ โโ Topic: 37
โ โ โ โโPrice comparison of books on various subjects at popular e-commerce sites, and image prompts for dra
โ โ โ โโPrice comparison of educational books on mathematics and psychology at popular online bookstores____
โ โ โ โ โโPrice comparison and edition analysis for various subjects on Flipkart, Amazon, and other online boo
โ โ โ โ โ โโPrice comparison of books on mathematics and psychology across major online retailers including Flip
โ โ โ โ โ โ โโโ โโDisney, Fantasyland, Imagineers, Walter Elias Disney, gravestone, plaque, inscribed.____ โโ Topic: 80
โ โ โ โ โ โ โโโ โโPrice comparison for books on mathematics and psychology on various platforms (Flipkart, Amazon, etc โโ Topic: 60
โ โ โ โ โ โโImage prompts for physical fitness and science discussions____
โ โ โ โ โ โโFitness, Mathematics, and Environmental Changes - Image Prompts for Physical Activity and Scientific
โ โ โ โ โ โ โโโ โโFitness and Workout Mats โ Improving Aerobic Fitness and Health with Non-Slip Exercise Mats____ โโ Topic: 81
โ โ โ โ โ โ โโโ โโVisual prompts and dialogues on changing physical and mathematical environments, coursework on nichr โโ Topic: 72
โ โ โ โ โ โโโ โโCareer Pathways and Development Resources____ โโ Topic: 114
โ โ โ โ โโChristmas-themed Preschool Activity Ideas____
โ โ โ โ โโChristmas preschool theme with image prompts and activities incorporating alphabet, counting, and ho
โ โ โ โ โ โโโ โโChristmas themed image prompts, online shopping, and holiday traditions____ โโ Topic: 55
โ โ โ โ โ โโโ โโPreschool Alphabet Book Crafts____ โโ Topic: 59
โ โ โ โ โโโ โโBook covers and adventure novels____ โโ Topic: 66
โ โ โ โโNutrition and Food Images and Prompts____
โ โ โ โโAgriculture and Horticulture Topics in Georgia, including Cotton, Wheat, and Greenhouse Management__
โ โ โ โ โโโ โโAgricultural innovations and community support in Holland, Burkina Faso, and Uganda____ โโ Topic: 118
โ โ โ โ โโโ โโWheat crop and horticulture practices____ โโ Topic: 62
โ โ โ โโNutrition and Food Science____
โ โ โ โโโ โโCanned meats and fruits, stability and quality symposium, and image prompts for food drawings____ โโ Topic: 33
โ โ โ โโโ โโNutrition and Image Prompts____ โโ Topic: 57
โ โ โโSchool Hooded Cotton Sweatshirts with Super-Soft Fleece____
โ โ โโDress and Fashion Image Prompts, Medieval Fantasy Paper Dresses, and Elegant Tulle Prom Dresses____
โ โ โ โโโ โโFantasy Paper Doll Outfits and Image Prompts____ โโ Topic: 103
โ โ โ โโโ โโFashion and Dresses Inspiration____ โโ Topic: 11
โ โ โโHigh school hooded sweatshirts in super-soft cotton/poly fleece____
โ โ โโโ โโHigh School Hooded Sweatshirts - Super Soft Cotton/Poly Fleece to Keep You Warm on the Sidelines____ โโ Topic: 20
โ โ โโโ โโHigh School Racerback Tank Tops with District Threads____ โโ Topic: 109
โ โโWorksheets and Printables for Math, Kindergarten, and Beyond____
โ โโMath and Reading Worksheets for Grades 7-9____
โ โ โโโ โโPrintable worksheets for math, reading, and kindergarten learning with image prompts.____ โโ Topic: 5
โ โ โโโ โโMath Games and Activities for Engaging Students in Homeschool and Classroom Settings____ โโ Topic: 112
โ โโโ โโColoring Pages for Various Themes and Sizes____ โโ Topic: 24
โโClipart use for teaching materials with unlimited access for members on abcteach____
โโโ โโClipart use for teaching materials in commercial format with unlimited illustrations as an abcteach โโ Topic: 0
โโโ โโFlags Clipart for Teaching with Abcteach Membership____ โโ Topic: 50
```
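Every leaf of the tree above ends with a numeric topic ID. For working with such a dump programmatically, here is a small sketch (a hypothetical helper, not part of any released tooling) that extracts each leaf's ID and truncated label:

```python
import re

# Leaf lines end with "... Topic: <id>"; internal nodes do not.
TOPIC_RE = re.compile(r"Topic:\s*(\d+)\s*$")
# Tree-drawing characters (and the mojibake variant) to trim from labels.
TREE_CHARS = "│├└─■โ "

def leaf_topics(tree_text):
    """Map topic id -> label for every leaf line in a topic-tree dump."""
    topics = {}
    for line in tree_text.splitlines():
        m = TOPIC_RE.search(line)
        if not m:
            continue
        # Drop the tree prefix, trailing dashes, and the "____" name padding.
        label = line[: m.start()].strip(TREE_CHARS + "_")
        topics[int(m.group(1))] = label
    return topics
```

Note that `str.strip` removes any of the given characters from both ends, so internal hyphens and underscores in a label are left untouched.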
Thanks to LAION volunteers: @Rallio, @Kenhktsui, @Danielpatrickhug, @Vyprix, and @Summer.
|
CVasNLPExperiments/Food101_test_facebook_opt_350m_mode_T_SPECIFIC_A_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 36249
num_examples: 100
download_size: 8450
dataset_size: 36249
---
# Dataset Card for "Food101_test_facebook_opt_350m_mode_T_SPECIFIC_A_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malhajar/winogrande-tr | ---
language:
- tr
paperswithcode_id: winogrande
pretty_name: WinoGrande
dataset_info:
- config_name: winogrande_xs
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 20704
num_examples: 160
- name: test
num_bytes: 227649
num_examples: 1767
- name: validation
num_bytes: 164199
num_examples: 1267
download_size: 3395492
dataset_size: 412552
- config_name: winogrande_s
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 82308
num_examples: 640
- name: test
num_bytes: 227649
num_examples: 1767
- name: validation
num_bytes: 164199
num_examples: 1267
download_size: 3395492
dataset_size: 474156
- config_name: winogrande_m
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 329001
num_examples: 2558
- name: test
num_bytes: 227649
num_examples: 1767
- name: validation
num_bytes: 164199
num_examples: 1267
download_size: 3395492
dataset_size: 720849
- config_name: winogrande_l
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 1319576
num_examples: 10234
- name: test
num_bytes: 227649
num_examples: 1767
- name: validation
num_bytes: 164199
num_examples: 1267
download_size: 3395492
dataset_size: 1711424
- config_name: winogrande_xl
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 5185832
num_examples: 40398
- name: test
num_bytes: 227649
num_examples: 1767
- name: validation
num_bytes: 164199
num_examples: 1267
download_size: 3395492
dataset_size: 5577680
- config_name: winogrande_debiased
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 1203420
num_examples: 9248
- name: test
num_bytes: 227649
num_examples: 1767
- name: validation
num_bytes: 164199
num_examples: 1267
download_size: 3395492
dataset_size: 1595268
configs:
- config_name: winogrande_debiased
data_files:
- split: train
path: winogrande_debiased/*_train-*
- split: test
path: winogrande_debiased/*_test-*
- split: validation
path: winogrande_debiased/*_validation-*
- config_name: winogrande_m
data_files:
- split: train
path: winogrande_m/winogrande_m_train-*
- split: test
path: winogrande_m/winogrande_m_test-*
- split: validation
path: winogrande_m/winogrande_m_validation-*
license: apache-2.0
---
# Dataset Card for "winogrande"
This dataset is part of a series of datasets aimed at advancing Turkish LLM development by establishing rigorous Turkish benchmarks for evaluating the performance of LLMs produced in Turkish.
malhajar/winogrande-tr is a translated version of [`winogrande`](https://huggingface.co/datasets/winogrande), aimed specifically for use in the [`OpenLLMTurkishLeaderboard`](https://huggingface.co/spaces/malhajar/OpenLLMTurkishLeaderboard)
**Translated by:** [`Mohamad Alhajar`](https://www.linkedin.com/in/muhammet-alhajar/)
### Dataset Summary
WinoGrande is a collection of 44k problems, inspired by the Winograd Schema Challenge (Levesque, Davis, and Morgenstern
2011) but adjusted to improve both scale and robustness against dataset-specific bias. It is formulated as a
fill-in-the-blank task with binary options: the goal is to choose the right option for a given sentence, which requires
commonsense reasoning.
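To make the task concrete, here is a minimal sketch (using an invented English-style item, not drawn from this dataset) of how the two candidate completions are formed from one record:

```python
def fill(sentence, option):
    """Substitute one candidate option into the blank ('_') of a WinoGrande-style sentence."""
    return sentence.replace("_", option, 1)

item = {
    "sentence": "The trophy does not fit in the suitcase because _ is too large.",
    "option1": "the trophy",
    "option2": "the suitcase",
    "answer": "1",  # "1" or "2", indexing option1/option2
}

candidates = [fill(item["sentence"], item["option1"]),
              fill(item["sentence"], item["option2"])]
gold = candidates[int(item["answer"]) - 1]
```

A model scores both candidate sentences and picks the more plausible one; accuracy is then exact match against `answer`.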
### Supported Tasks and Leaderboards
This dataset is intended specifically for use in the [`OpenLLMTurkishLeaderboard`](https://huggingface.co/spaces/malhajar/OpenLLMTurkishLeaderboard)
### Languages
Turkish
## Dataset Structure
### Data Instances
#### winogrande_debiased
- **Size of downloaded dataset files:** 3.40 MB
- **Size of the generated dataset:** 1.59 MB
- **Total amount of disk used:** 4.99 MB
An example of 'train' looks as follows.
```
```
#### winogrande_l
- **Size of downloaded dataset files:** 3.40 MB
- **Size of the generated dataset:** 1.71 MB
- **Total amount of disk used:** 5.11 MB
An example of 'validation' looks as follows.
```
```
#### winogrande_m
- **Size of downloaded dataset files:** 3.40 MB
- **Size of the generated dataset:** 0.72 MB
- **Total amount of disk used:** 4.12 MB
An example of 'validation' looks as follows.
```
```
#### winogrande_s
- **Size of downloaded dataset files:** 3.40 MB
- **Size of the generated dataset:** 0.47 MB
- **Total amount of disk used:** 3.87 MB
An example of 'validation' looks as follows.
```
```
#### winogrande_xl
- **Size of downloaded dataset files:** 3.40 MB
- **Size of the generated dataset:** 5.58 MB
- **Total amount of disk used:** 8.98 MB
An example of 'train' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### winogrande_debiased
- `sentence`: a `string` feature.
- `option1`: a `string` feature.
- `option2`: a `string` feature.
- `answer`: a `string` feature.
#### winogrande_l
- `sentence`: a `string` feature.
- `option1`: a `string` feature.
- `option2`: a `string` feature.
- `answer`: a `string` feature.
#### winogrande_m
- `sentence`: a `string` feature.
- `option1`: a `string` feature.
- `option2`: a `string` feature.
- `answer`: a `string` feature.
#### winogrande_s
- `sentence`: a `string` feature.
- `option1`: a `string` feature.
- `option2`: a `string` feature.
- `answer`: a `string` feature.
#### winogrande_xl
- `sentence`: a `string` feature.
- `option1`: a `string` feature.
- `option2`: a `string` feature.
- `answer`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------------------|----:|---------:|---:|
|winogrande_debiased| 9248| 1267|1767|
|winogrande_l |10234| 1267|1767|
|winogrande_m | 2558| 1267|1767|
|winogrande_s | 640| 1267|1767|
|winogrande_xl |40398| 1267|1767|
|winogrande_xs | 160| 1267|1767|
### Citation Information
```
@InProceedings{ai2:winogrande,
title = {WinoGrande: An Adversarial Winograd Schema Challenge at Scale},
author = {Sakaguchi, Keisuke and Le Bras, Ronan and Bhagavatula, Chandra and Choi, Yejin},
year = {2019}
}
``` |
PJMixers/hieunguyenminh_roleplay-ShareGPT | ---
language:
- en
source_datasets: hieunguyenminh/roleplay
---
# Changes
1. Reformatted into ShareGPT.
2. Removed the few duplicate characters.
3. Removed samples with no messages.
4. Fixed messages where the character was speaking in the third person so that they speak in the first person, as they should. |
Fredithefish/Pronoun-Rich-Conversations | ---
language:
- en
---
## Example Data for "Mastering Pronoun Resolution in Conversational Models" |
imoxto/sampleEvalData | ---
language:
- en
--- |
nath720/stabco | ---
license: openrail
---
|
DeepFoldProtein/2022-12-17-pdb-intersect-pisces_pc30_r2.5 | ---
dataset_info:
features:
- name: pdb_id
dtype: string
- name: chain_code
dtype: string
- name: seq
dtype: string
- name: sst8
dtype: string
- name: sst3
dtype: string
- name: len_x
dtype: int64
- name: has_nonstd_aa
dtype: bool
- name: len_y
dtype: int64
- name: method
dtype: string
- name: resol
dtype: float64
- name: rfac
dtype: float64
- name: freerfac
dtype: float64
splits:
- name: train
num_bytes: 12398412
num_examples: 15079
download_size: 6886024
dataset_size: 12398412
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
okbenzene2002/hindi-dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 26780164
num_examples: 637
download_size: 5030309
dataset_size: 26780164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713059852 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 12691
num_examples: 28
download_size: 9885
dataset_size: 12691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713059852"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
israfelsr/tokenized_cc3m | ---
language:
- en
license: mit
size_categories:
- 1M<n<10M
task_categories:
- text-generation
pretty_name: CLIP and T5 tokenization of CC3M
dataset_info:
features:
- name: text
dtype: string
- name: clip_ids
sequence: int64
- name: clip_attention_mask
sequence: int64
- name: t5_ids
sequence: int64
- name: t5_attention_mask
sequence: int64
splits:
- name: train
num_bytes: 31520132297
num_examples: 3318333
- name: validation
num_bytes: 150459428
num_examples: 15840
download_size: 362821979
dataset_size: 31670591725
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# CLIP and T5 tokenization of CC3M |
lmg-anon/VNTL-v3.1-1k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
- name: ignore_loss
sequence: int64
splits:
- name: train
num_bytes: 31045416
num_examples: 12903
- name: val
num_bytes: 3872937
num_examples: 1639
download_size: 15766667
dataset_size: 34918353
---
# Dataset Card for "VNTL-v3.1-1k"
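Each record carries a `text` string plus an `ignore_loss` sequence (see the feature list above). A plausible reading, not confirmed by this card, is that non-zero entries mark token positions to exclude from the training loss; a common way to apply such a mask, under the usual `-100` label convention, would be:

```python
IGNORE_INDEX = -100  # label value that cross-entropy loss implementations typically skip

def mask_labels(input_ids, ignore_loss):
    """Copy input ids into labels, blanking out positions flagged by the ignore mask."""
    return [IGNORE_INDEX if flag else tok for tok, flag in zip(input_ids, ignore_loss)]
```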
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Eric111__CatunaMayo | ---
pretty_name: Evaluation run of eric111/CatunaMayo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eric111/CatunaMayo](https://huggingface.co/eric111/CatunaMayo) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eric111__CatunaMayo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T12:19:10.948803](https://huggingface.co/datasets/open-llm-leaderboard/details_eric111__CatunaMayo/blob/main/results_2024-02-22T12-19-10.948803.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6575531238421152,\n\
\ \"acc_stderr\": 0.031937472435679695,\n \"acc_norm\": 0.6570510728428876,\n\
\ \"acc_norm_stderr\": 0.03260291830758222,\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.0174129419861153,\n \"mc2\": 0.6996030299637043,\n\
\ \"mc2_stderr\": 0.014680491046789042\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.01344952210993249,\n\
\ \"acc_norm\": 0.7175767918088737,\n \"acc_norm_stderr\": 0.013155456884097224\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6970722963553077,\n\
\ \"acc_stderr\": 0.004585850835623563,\n \"acc_norm\": 0.879008165704043,\n\
\ \"acc_norm_stderr\": 0.0032545129328064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.0174129419861153,\n \"mc2\": 0.6996030299637043,\n\
\ \"mc2_stderr\": 0.014680491046789042\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498437\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7232752084912812,\n \
\ \"acc_stderr\": 0.012323047397959795\n }\n}\n```"
repo_url: https://huggingface.co/eric111/CatunaMayo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|arc:challenge|25_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|gsm8k|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hellaswag|10_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T00-21-24.620953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-19-10.948803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-19-10.948803.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- '**/details_harness|winogrande|5_2024-02-21T00-21-24.620953.parquet'
- split: 2024_02_22T12_19_10.948803
path:
- '**/details_harness|winogrande|5_2024-02-22T12-19-10.948803.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T12-19-10.948803.parquet'
- config_name: results
data_files:
- split: 2024_02_21T00_21_24.620953
path:
- results_2024-02-21T00-21-24.620953.parquet
- split: 2024_02_22T12_19_10.948803
path:
- results_2024-02-22T12-19-10.948803.parquet
- split: latest
path:
- results_2024-02-22T12-19-10.948803.parquet
---
# Dataset Card for Evaluation run of eric111/CatunaMayo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eric111/CatunaMayo](https://huggingface.co/eric111/CatunaMayo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eric111__CatunaMayo",
"harness_winogrande_5",
	split="latest")
```
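As the config list above shows, run splits are named after the run timestamp, with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A small helper sketching that mapping (the helper name is ours, not part of the `datasets` API):

```python
# Hypothetical helper: derive the split name used in this dataset from an
# ISO-style run timestamp, e.g.
#   "2024-02-22T12:19:10.948803" -> "2024_02_22T12_19_10.948803"
def timestamp_to_split(ts: str) -> str:
    date, time = ts.split("T", 1)
    return date.replace("-", "_") + "T" + time.replace(":", "_")
```

For example, `timestamp_to_split("2024-02-22T12:19:10.948803")` yields `2024_02_22T12_19_10.948803`, the second run split in every configuration above.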
## Latest results
These are the [latest results from run 2024-02-22T12:19:10.948803](https://huggingface.co/datasets/open-llm-leaderboard/details_eric111__CatunaMayo/blob/main/results_2024-02-22T12-19-10.948803.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6575531238421152,
"acc_stderr": 0.031937472435679695,
"acc_norm": 0.6570510728428876,
"acc_norm_stderr": 0.03260291830758222,
"mc1": 0.5507955936352509,
"mc1_stderr": 0.0174129419861153,
"mc2": 0.6996030299637043,
"mc2_stderr": 0.014680491046789042
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.01344952210993249,
"acc_norm": 0.7175767918088737,
"acc_norm_stderr": 0.013155456884097224
},
"harness|hellaswag|10": {
"acc": 0.6970722963553077,
"acc_stderr": 0.004585850835623563,
"acc_norm": 0.879008165704043,
"acc_norm_stderr": 0.0032545129328064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5507955936352509,
"mc1_stderr": 0.0174129419861153,
"mc2": 0.6996030299637043,
"mc2_stderr": 0.014680491046789042
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498437
},
"harness|gsm8k|5": {
"acc": 0.7232752084912812,
"acc_stderr": 0.012323047397959795
}
}
```
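Each `harness|hendrycksTest-*` key above is a single MMLU subject. If you want a subject-level macro-average from a dict shaped like this JSON, a minimal sketch (with illustrative values, not the real scores) is:

```python
# Sketch: macro-average MMLU subject accuracies from a results dict shaped
# like the JSON above. The values here are illustrative, not the real scores.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.66},
    "harness|winogrande|5": {"acc": 0.83},  # not an MMLU subject, filtered out
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)  # 0.52 for these two subjects
```

Note that the `"all"` block in the actual results averages over every task, not just the MMLU subjects.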
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jan-hq/dolphin_coder_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 269516157.42079216
num_examples: 98206
- name: test
num_bytes: 29946849.57920783
num_examples: 10912
download_size: 134970100
dataset_size: 299463007.0
---
# Dataset Card for "dolphin_coder_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liuchanghf__bloomz-3b-mmlu-lora | ---
pretty_name: Evaluation run of liuchanghf/bloomz-3b-mmlu-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liuchanghf/bloomz-3b-mmlu-lora](https://huggingface.co/liuchanghf/bloomz-3b-mmlu-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liuchanghf__bloomz-3b-mmlu-lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T10:33:12.367170](https://huggingface.co/datasets/open-llm-leaderboard/details_liuchanghf__bloomz-3b-mmlu-lora/blob/main/results_2024-04-15T10-33-12.367170.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.34170084234616815,\n\
\ \"acc_stderr\": 0.03328462825216582,\n \"acc_norm\": 0.34618141980620765,\n\
\ \"acc_norm_stderr\": 0.03418266009967733,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.39596727744877347,\n\
\ \"mc2_stderr\": 0.015688195773723893\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3319112627986348,\n \"acc_stderr\": 0.01376098820088054,\n\
\ \"acc_norm\": 0.3583617747440273,\n \"acc_norm_stderr\": 0.01401288333485986\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41724756024696275,\n\
\ \"acc_stderr\": 0.004920967192255289,\n \"acc_norm\": 0.5494921330412268,\n\
\ \"acc_norm_stderr\": 0.004965276587781621\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.030437794342983045,\n\
\ \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.030437794342983045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252603,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252603\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n\
\ \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.3063583815028902,\n\
\ \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.038924311065187546,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.038924311065187546\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.04161808503501528,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.04161808503501528\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3774193548387097,\n \"acc_stderr\": 0.027575960723278246,\n \"\
acc_norm\": 0.3774193548387097,\n \"acc_norm_stderr\": 0.027575960723278246\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233485,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4090909090909091,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.03469713791704372,\n\
\ \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.03469713791704372\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n\
\ \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609553,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609553\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.27310924369747897,\n \"acc_stderr\": 0.028942004040998167,\n\
\ \"acc_norm\": 0.27310924369747897,\n \"acc_norm_stderr\": 0.028942004040998167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4036697247706422,\n \"acc_stderr\": 0.021035704856574963,\n \"\
acc_norm\": 0.4036697247706422,\n \"acc_norm_stderr\": 0.021035704856574963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2107843137254902,\n \"acc_stderr\": 0.028626547912437388,\n \"\
acc_norm\": 0.2107843137254902,\n \"acc_norm_stderr\": 0.028626547912437388\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4641350210970464,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.4641350210970464,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4462809917355372,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.4462809917355372,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.049392914472734785,\n\
\ \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.049392914472734785\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5085470085470085,\n\
\ \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.5085470085470085,\n\
\ \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.41379310344827586,\n\
\ \"acc_stderr\": 0.017612204084663772,\n \"acc_norm\": 0.41379310344827586,\n\
\ \"acc_norm_stderr\": 0.017612204084663772\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.026424816594009852,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.026424816594009852\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.014487500852850409,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.014487500852850409\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.026568921015457138,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.026568921015457138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3734567901234568,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.3734567901234568,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2770534550195567,\n\
\ \"acc_stderr\": 0.011430462443719673,\n \"acc_norm\": 0.2770534550195567,\n\
\ \"acc_norm_stderr\": 0.011430462443719673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3758169934640523,\n \"acc_stderr\": 0.01959402113657745,\n \
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.01959402113657745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4626865671641791,\n\
\ \"acc_stderr\": 0.03525675167467974,\n \"acc_norm\": 0.4626865671641791,\n\
\ \"acc_norm_stderr\": 0.03525675167467974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807765,\n \"mc2\": 0.39596727744877347,\n\
\ \"mc2_stderr\": 0.015688195773723893\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5777426992896606,\n \"acc_stderr\": 0.013881582030658549\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/liuchanghf/bloomz-3b-mmlu-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-33-12.367170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-33-12.367170.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- '**/details_harness|winogrande|5_2024-04-15T10-33-12.367170.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T10-33-12.367170.parquet'
- config_name: results
data_files:
- split: 2024_04_15T10_33_12.367170
path:
- results_2024-04-15T10-33-12.367170.parquet
- split: latest
path:
- results_2024-04-15T10-33-12.367170.parquet
---
# Dataset Card for Evaluation run of liuchanghf/bloomz-3b-mmlu-lora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liuchanghf/bloomz-3b-mmlu-lora](https://huggingface.co/liuchanghf/bloomz-3b-mmlu-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liuchanghf__bloomz-3b-mmlu-lora",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T10:33:12.367170](https://huggingface.co/datasets/open-llm-leaderboard/details_liuchanghf__bloomz-3b-mmlu-lora/blob/main/results_2024-04-15T10-33-12.367170.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each eval's results are available in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.34170084234616815,
"acc_stderr": 0.03328462825216582,
"acc_norm": 0.34618141980620765,
"acc_norm_stderr": 0.03418266009967733,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.39596727744877347,
"mc2_stderr": 0.015688195773723893
},
"harness|arc:challenge|25": {
"acc": 0.3319112627986348,
"acc_stderr": 0.01376098820088054,
"acc_norm": 0.3583617747440273,
"acc_norm_stderr": 0.01401288333485986
},
"harness|hellaswag|10": {
"acc": 0.41724756024696275,
"acc_stderr": 0.004920967192255289,
"acc_norm": 0.5494921330412268,
"acc_norm_stderr": 0.004965276587781621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.030437794342983045,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.030437794342983045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252603,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252603
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187546,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187546
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.04161808503501528,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.04161808503501528
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982026,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982026
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3774193548387097,
"acc_stderr": 0.027575960723278246,
"acc_norm": 0.3774193548387097,
"acc_norm_stderr": 0.027575960723278246
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233485,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.03469713791704372,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.03469713791704372
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609553,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609553
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.27310924369747897,
"acc_stderr": 0.028942004040998167,
"acc_norm": 0.27310924369747897,
"acc_norm_stderr": 0.028942004040998167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4036697247706422,
"acc_stderr": 0.021035704856574963,
"acc_norm": 0.4036697247706422,
"acc_norm_stderr": 0.021035704856574963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2107843137254902,
"acc_stderr": 0.028626547912437388,
"acc_norm": 0.2107843137254902,
"acc_norm_stderr": 0.028626547912437388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4641350210970464,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.4641350210970464,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4462809917355372,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.4462809917355372,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.049392914472734785,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.049392914472734785
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5085470085470085,
"acc_stderr": 0.0327513030009703,
"acc_norm": 0.5085470085470085,
"acc_norm_stderr": 0.0327513030009703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.017612204084663772,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.017612204084663772
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850409,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850409
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.026568921015457138,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.026568921015457138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3734567901234568,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.3734567901234568,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2770534550195567,
"acc_stderr": 0.011430462443719673,
"acc_norm": 0.2770534550195567,
"acc_norm_stderr": 0.011430462443719673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.01959402113657745,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.01959402113657745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4626865671641791,
"acc_stderr": 0.03525675167467974,
"acc_norm": 0.4626865671641791,
"acc_norm_stderr": 0.03525675167467974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807765,
"mc2": 0.39596727744877347,
"mc2_stderr": 0.015688195773723893
},
"harness|winogrande|5": {
"acc": 0.5777426992896606,
"acc_stderr": 0.013881582030658549
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
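The per-task numbers above can also be re-aggregated locally once the results JSON has been parsed. A minimal sketch, using a hand-copied subset of the values shown (the dictionary here is illustrative; the real file holds every task):

```python
# Recompute a macro-average accuracy over the MMLU (hendrycksTest) subtasks.
# In practice you would json.load the full results file instead of this subset.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.5339805825242718},
    "harness|hendrycksTest-marketing|5": {"acc": 0.5085470085470085},
    "harness|winogrande|5": {"acc": 0.5777426992896606},  # not an MMLU task
}

# Keep only the hendrycksTest (MMLU) entries and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average acc over {len(mmlu_accs)} tasks: {mmlu_macro_acc:.4f}")
```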
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mito0o852/ContextToQuestions | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: difficulty
dtype: string
- name: question_type
dtype: string
- name: options
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 764712
num_examples: 405
download_size: 154709
dataset_size: 764712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text2text-generation
language:
- en
tags:
- finance
- biology
pretty_name: Context To Questions Dataset
size_categories:
- n<1K
---
# Context-Based Question Generation Dataset
This dataset is designed for context-based question generation, where questions of different types (true/false, multiple-choice, open-ended) are generated based on a given context. The dataset is synthetically created using ChatGPT, providing a diverse set of questions to test comprehension and reasoning skills.
## Dataset Structure
The dataset is structured with the following fields for each example:
- `context`: The context provided as input for question generation.
- `question`: The generated question related to the context.
- `difficulty`: The difficulty level assigned to the question (e.g., "Easy", "Medium", "Hard").
- `question_type`: The type of question generated (e.g., "True/False," "Multiple Choice," "Open Ended").
- `options`: For multiple-choice questions, the list of options presented.
- `answer`: The correct answer or response to the question.
## Examples
### Example 1
**Context:**
Planning and producing responses requires an ability to make sense of the world around us. Making judgments and reasoning in the abstract are necessary to produce movements as part of larger responses...
**Question:**
According to the text, what functions are attributed to the prefrontal cortex?
**Difficulty:** Medium
**Question Type:** Multiple Choice
**Options:** ["A) Motor functions", "B) Abstract reasoning and judgment", "C) Visual processing"]
**Answer:** B) Abstract reasoning and judgment
### Example 2
**Context:**
In the mental status exam, the subtest that assesses judgment and reasoning is directed at three aspects of frontal lobe function...
**Question:**
In the mental status exam, which aspects of frontal lobe function are specifically assessed?
**Difficulty:** Medium
**Question Type:** Multiple Choice
**Options:** ["A) Motor functions", "B) Problem-solving, interpretation of proverbs, word comparisons", "C) Visual processing"]
**Answer:** B) Problem-solving, interpretation of proverbs, word comparisons
## Usage
This dataset can be utilized for training and evaluating models that focus on context-based question generation. It offers a diverse set of examples, covering various difficulty levels and question types.
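For instance, here is a minimal sketch of turning one record into an (input, target) pair for text2text fine-tuning. The `record_to_pair` helper is illustrative (not part of the dataset), and it assumes the `options` field is a JSON-encoded list, as in the examples above:

```python
import json

def record_to_pair(example):
    """Convert one dataset record into an (input, target) pair for
    text2text-generation fine-tuning. Illustrative helper; field handling
    assumes the schema described in this card."""
    prompt = (
        f"Generate a {example['difficulty'].lower()} "
        f"{example['question_type'].lower()} question for this context:\n"
        f"{example['context']}"
    )
    target = example["question"]
    # Multiple-choice records appear to carry options as a JSON-encoded string.
    if example["options"]:
        options = json.loads(example["options"])
        target += "\n" + "\n".join(options)
    return prompt, target

sample = {
    "context": "The prefrontal cortex supports abstract reasoning and judgment.",
    "question": "What functions are attributed to the prefrontal cortex?",
    "difficulty": "Medium",
    "question_type": "Multiple Choice",
    "options": '["A) Motor functions", "B) Abstract reasoning and judgment"]',
    "answer": "B) Abstract reasoning and judgment",
}

prompt, target = record_to_pair(sample)
print(prompt)
print(target)
```

Applied with `datasets.map`, a function like this yields model-ready pairs for any seq2seq trainer.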
Feel free to explore and contribute to this dataset to enhance its richness and applicability.
## Citation
If you use this dataset in your research or application, please cite it as follows:
```bibtex
@misc{context_based_question_dataset,
title={Context-Based Question Generation Dataset},
author={Moustapha Oumar},
year={2024},
}
``` |
emozilla/lg-nf | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: Title
dtype: string
- name: VolumeInfo
dtype: string
- name: Series
dtype: string
- name: Periodical
dtype: string
- name: Author
dtype: string
- name: Year
dtype: string
- name: Edition
dtype: string
- name: Publisher
dtype: string
- name: City
dtype: string
- name: Pages
dtype: string
- name: PagesInFile
dtype: int64
- name: Language
dtype: string
- name: Topic
dtype: string
- name: Library
dtype: string
- name: Issue
dtype: string
- name: Identifier
dtype: string
- name: ISSN
dtype: string
- name: ASIN
dtype: string
- name: UDC
dtype: string
- name: LBC
dtype: string
- name: DDC
dtype: string
- name: LCC
dtype: string
- name: Doi
dtype: string
- name: Googlebookid
dtype: string
- name: OpenLibraryID
dtype: string
- name: Commentary
dtype: string
- name: DPI
dtype: int64
- name: Color
dtype: string
- name: Cleaned
dtype: string
- name: Orientation
dtype: string
- name: Paginated
dtype: string
- name: Scanned
dtype: string
- name: Bookmarked
dtype: string
- name: Searchable
dtype: string
- name: Filesize
dtype: int64
- name: Extension
dtype: string
- name: MD5
dtype: string
- name: Generic
dtype: string
- name: Visible
dtype: string
- name: Locator
dtype: string
- name: Local
dtype: int64
- name: TimeAdded
dtype: string
- name: TimeLastModified
dtype: string
- name: Coverurl
dtype: string
- name: Tags
dtype: string
- name: IdentifierWODash
dtype: string
splits:
- name: train
num_bytes: 8003252615
num_examples: 13122165
download_size: 3103416293
dataset_size: 8003252615
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "libgen-nonfiction-metadata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Toyokolabs/retinoblastoma | ---
license: cc-by-4.0
task_categories:
- question-answering
language:
- en
tags:
- biology
pretty_name: Retinoblastoma
size_categories:
- 1M<n<10M
---
# Retinoblastoma Dataset
This dataset contains information related to retinoblastoma from [ClinvarTuring](https://github.com/ToyokoLabs/ClinvarTuring).
## Licensing Information
License: cc-by-4.0
## Authors
Morgan Lyu, Sebastian Bassi and Virginia Gonzalez
--- |
LightTai/personalized-email | ---
license: other
---
|
bergr7/weakly_supervised_ag_news | ---
annotations_creators: []
language:
- en
language_creators:
- other
license: []
multilinguality:
- monolingual
pretty_name: Weakly supervised AG News Dataset
size_categories:
- 1K<n<10K
source_datasets:
- extended|ag_news
tags: []
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for Weakly supervised AG News Dataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
AG is a collection of more than 1 million news articles, gathered from more than 2,000 news sources by ComeToMyHead over more than one year of activity. ComeToMyHead is an academic news search engine that has been running since July 2004. The dataset is provided by the academic community for research purposes in data mining (clustering, classification, etc.), information retrieval (ranking, search, etc.), XML, data compression, data streaming, and any other non-commercial activity. For more information, please refer to http://www.di.unipi.it/~gulli/AG_corpus_of_news_articles.html .
The Weakly supervised AG News Dataset was created by Team 44 of the FSDL 2022 course for the sole purpose of experimenting with weak supervision techniques. It was assumed that only the labels of the original test set and of 20% of the training set were available. The remaining training labels were obtained by creating weak labels with labeling functions (LFs) and denoising them with Snorkel's label model.
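As a minimal sketch of the weak-supervision idea described above, the snippet below applies keyword-based labeling functions and combines their votes by majority. This is only illustrative: the actual dataset used Snorkel's label model for denoising, and the keywords and function names here are invented for the example.

```python
# Illustrative labeling functions for the AG News classes.
# ABSTAIN means the function makes no prediction for that text.
ABSTAIN, WORLD, SPORTS, BUSINESS, SCITECH = -1, 0, 1, 2, 3

def lf_sports(text):
    return SPORTS if any(w in text.lower() for w in ("match", "coach", "season")) else ABSTAIN

def lf_business(text):
    return BUSINESS if any(w in text.lower() for w in ("stock", "market", "profit")) else ABSTAIN

def lf_scitech(text):
    return SCITECH if any(w in text.lower() for w in ("software", "nasa", "internet")) else ABSTAIN

LFS = [lf_sports, lf_business, lf_scitech]

def weak_label(text):
    # Collect non-abstaining votes and take a simple majority.
    votes = [lf(text) for lf in LFS]
    votes = [v for v in votes if v != ABSTAIN]
    if not votes:
        return ABSTAIN  # example stays unlabeled
    return max(set(votes), key=votes.count)

print(weak_label("The coach praised the team after the season opener."))  # 1 (Sports)
```

In the real pipeline, a label model replaces the majority vote and weighs each LF by its estimated accuracy, producing the probabilistic training labels mentioned above.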
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- `text`: a string feature.
- `label`: a classification label, with possible values including World (0), Sports (1), Business (2), Sci/Tech (3).
### Data Splits
- Training set with probabilistic labels from weak supervision: 37340
- Unlabeled data: 58660
- Validation set: 24000
- Test set: 7600
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to Xiang Zhang (xiang.zhang@nyu.edu) for adding this dataset to the HF Dataset Hub. |
hetline/sentiment-banking | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: 'null'
- name: metadata
struct:
- name: category
dtype: int64
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 1205760
num_examples: 5001
download_size: 448853
dataset_size: 1205760
---
# Dataset Card for "sentiment-banking"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Emm9625/xsum-shard10 | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 47921810.56637017
num_examples: 20405
- name: test
num_bytes: 2677030.5182636315
num_examples: 1134
- name: validation
num_bytes: 2631143.886163078
num_examples: 1134
download_size: 34122625
dataset_size: 53229984.970796876
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
simpletransformers/celeba_with_captions | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 19563162
num_examples: 24000
download_size: 4847318
dataset_size: 19563162
---
# Dataset Card for "celeba_with_captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malucoelhaofc/ScottTenormanEnglishV2 | ---
license: openrail
---
|
KaiLv/UDR_RocEnding | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: question
dtype: string
- name: target
dtype: string
- name: len_question
dtype: int64
- name: len_target
dtype: int64
splits:
- name: train
num_bytes: 22821733
num_examples: 87906
- name: validation
num_bytes: 2542405
num_examples: 9807
- name: test
num_bytes: 2542405
num_examples: 9807
- name: debug
num_bytes: 1297842
num_examples: 5000
download_size: 17953696
dataset_size: 29204385
---
# Dataset Card for "UDR_RocEnding"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/openhermes-k8 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 309315994
num_examples: 242831
download_size: 143821416
dataset_size: 309315994
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "openhermes-k8"
[teknium/openhermes](https://hf.co/datasets/teknium/openhermes) clustered into 8 clusters; the cluster centroids are included in 'centers.pt'. |
martinolmos/discursos_peron | ---
license: cc-by-sa-4.0
---
# Discursos Perón
Complete speeches delivered by former President Juan Domingo Perón between December 1, 1943 and September 19, 1955.
The documents, with the exception of those from 1949, were provided by the historian Enrique de Alzáa, who led a team that transcribed the paper originals held in the Archivo General de la Nación into an editable digital format. The speeches from 1949 were taken from Perón (2016)^1 in PDF format.
Because this work was carried out several years ago and at different times, the documents received correspond to three different versions of Microsoft Word documents. The variety of formats required extensive manipulation, cleaning, and ordering of the data. For more information on the preprocessing, see [here](https://ri.itba.edu.ar/handle/123456789/3537).
# Licensing information
This dataset is licensed under the Creative Commons Attribution-ShareAlike 4.0 International license [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/).
# Citation information
```
@misc{discursos_peron,
    author = {Olmos, Martin},
    title = {Discursos Perón},
    url = {https://github.com/martinolmos/discursos_peron},
    month = {May},
    year = {2022}
}
```
---
^1: Perón, J. D. (2016). Discursos, mensajes, correspondencia y escritos: 1949 / Perón (Tomos I y II). Buenos Aires, Argentina: Biblioteca del Congreso de la Nación.
|
luzDP/ThiagoMinos | ---
license: openrail
---
|
HuggingFaceH4/deita-10k-v0-sft | ---
license: mit
language:
- en
size_categories:
- 1K<n<10K
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 349335841.1
num_examples: 9500
- name: test_sft
num_bytes: 18386096.9
num_examples: 500
- name: train_gen
num_bytes: 336873383
num_examples: 9500
- name: test_gen
num_bytes: 16979716
num_examples: 500
download_size: 289754284
dataset_size: 721575037.0
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
# Dataset Card for Deita 10k v0
This is a formatted version of [`hkust-nlp/deita-10k-v0`](https://huggingface.co/datasets/hkust-nlp/deita-10k-v0) to store the conversations in the same format as the OpenAI SDK.
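The snippet below is a minimal sketch of the conversation format this dataset stores: a `messages` list of `{"role", "content"}` dicts, matching the shape used by OpenAI's chat API. The field names (`prompt`, `prompt_id`, `messages`) come from the schema above; the example values are invented for illustration.

```python
# One record in the shape described by this card's schema.
example = {
    "prompt": "What makes good data for alignment?",
    "prompt_id": "abc123",  # illustrative id, not a real record
    "messages": [
        {"role": "user", "content": "What makes good data for alignment?"},
        {"role": "assistant", "content": "Quality, diversity, and complexity all matter."},
    ],
}

# Every message exposes exactly the two keys the OpenAI SDK expects.
for m in example["messages"]:
    assert set(m) == {"role", "content"}

print(example["messages"][0]["role"])  # user
```

Because the messages already follow this convention, a record's `messages` field can be passed to chat-template tooling without further restructuring.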
## Citation
If you find this dataset useful, please cite the original dataset:
```
@misc{liu2023what,
title={What Makes Good Data for Alignment? A Comprehensive Study of Automatic Data Selection in Instruction Tuning},
author={Wei Liu and Weihao Zeng and Keqing He and Yong Jiang and Junxian He},
year={2023},
eprint={2312.15685},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
osellight/itchhikerGuide | ---
license: openrail
---
|
mask-distilled-one-sec-cv12/chunk_137 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1161918020
num_examples: 228185
download_size: 1185801431
dataset_size: 1161918020
---
# Dataset Card for "chunk_137"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
plaguss/curation-ultrafeedback-bad-rated | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: generations
dtype: string
- name: score_best_overall
dtype: float64
- name: rating-distilabel-gpt4
dtype: float64
- name: rationale-distilabel-gpt4
dtype: string
splits:
- name: train
num_bytes: 3681620
num_examples: 1968
download_size: 1959130
dataset_size: 3681620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_241 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1453429928.0
num_examples: 285434
download_size: 1485140322
dataset_size: 1453429928.0
---
# Dataset Card for "chunk_241"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_find_passage_train30_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8722
num_examples: 100
- name: validation
num_bytes: 4604
num_examples: 40
download_size: 10644
dataset_size: 13326
---
# Dataset Card for "random_letter_find_passage_train30_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/v3_train_free_concat_12 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842475632
num_examples: 2500
download_size: 1928250550
dataset_size: 3842475632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kavindu99/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
0: Emilia_Clarke
1: Henry_Cavil
2: Jason_Mamoa
3: Sadie_Sink
4: Sangakkara
5: Zendaya
splits:
- name: train
num_bytes: 160371.0
num_examples: 18
download_size: 160832
dataset_size: 160371.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713207591 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25887
num_examples: 72
download_size: 21041
dataset_size: 25887
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713207591"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gardner/glaive-function-calling-v2-sharegpt | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
- name: tools
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 543530268
num_examples: 111944
- name: test
num_bytes: 4606357
num_examples: 1000
download_size: 196687702
dataset_size: 548136625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
zeyue/test | ---
license: openrail
---
|
mideind/icelandic-inflection-hard | ---
license: cc-by-4.0
---
|
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-0e2388-51771145321 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: zhangfx7/deberta-base-finetuned-squad-pruned0.1
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: zhangfx7/deberta-base-finetuned-squad-pruned0.1
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@tp](https://huggingface.co/tp) for evaluating this model. |
izardy/malaysia-ejudgement | ---
dataset_name: ejudgement
description: Data source from https://kehakiman.gov.my/
language:
- en
- ms
tags:
- malaysia
- law
- judgement
---
#### This data repo consists of 3 data files
|No| Filename | File Description |
|--|----------|------------------|
|1 | edjudgement.zip | The originally scraped (zipped) PDF files |
|2 | raw.csv | Processed data (stage 1, unrefined) from the scraped PDFs |
|3 | train.csv | Processed data (stage 2, refined via image-to-text) from the scraped PDFs |
#### Links
- https://github.com/mesolitica/malaysian-dataset/tree/master/crawl/kehakiman.gov.my/eJudgment
|
open-llm-leaderboard/details_liminerity__Blur-7b-v1.21 | ---
pretty_name: Evaluation run of liminerity/Blur-7b-v1.21
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Blur-7b-v1.21](https://huggingface.co/liminerity/Blur-7b-v1.21) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-v1.21\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T13:28:00.366540](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.21/blob/main/results_2024-01-18T13-28-00.366540.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6540458763545218,\n\
\ \"acc_stderr\": 0.032093019516955965,\n \"acc_norm\": 0.6534601787133112,\n\
\ \"acc_norm_stderr\": 0.032764115724543935,\n \"mc1\": 0.5397796817625459,\n\
\ \"mc1_stderr\": 0.017448017223960867,\n \"mc2\": 0.6799010994882542,\n\
\ \"mc2_stderr\": 0.01527627642493985\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726291,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.01328452529240352\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.712109141605258,\n\
\ \"acc_stderr\": 0.004518546274738885,\n \"acc_norm\": 0.8807010555666202,\n\
\ \"acc_norm_stderr\": 0.003234774980647951\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887027,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887027\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846178,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846178\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"\
acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4692737430167598,\n\
\ \"acc_stderr\": 0.016690896161944385,\n \"acc_norm\": 0.4692737430167598,\n\
\ \"acc_norm_stderr\": 0.016690896161944385\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5397796817625459,\n\
\ \"mc1_stderr\": 0.017448017223960867,\n \"mc2\": 0.6799010994882542,\n\
\ \"mc2_stderr\": 0.01527627642493985\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.01267929754951543\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7b-v1.21
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-00.366540.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- '**/details_harness|winogrande|5_2024-01-18T13-28-00.366540.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T13-28-00.366540.parquet'
- config_name: results
data_files:
- split: 2024_01_18T13_28_00.366540
path:
- results_2024-01-18T13-28-00.366540.parquet
- split: latest
path:
- results_2024-01-18T13-28-00.366540.parquet
---
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.21
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.21](https://huggingface.co/liminerity/Blur-7b-v1.21) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.21",
"harness_winogrande_5",
	split="latest")
```
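The config names listed above map mechanically onto the harness task names that appear in the results: `harness_<task>_<num_fewshot>`, with `:` and `-` in the task name normalized to `_`. A small helper (hypothetical, for illustration only — not part of the `datasets` API) can build them:

```python
# Illustrative helper (not part of the `datasets` library): reconstructs the
# config name used in this card from a harness task name and few-shot count.
def harness_config_name(task: str, num_fewshot: int) -> str:
    # ":" and "-" in harness task names are normalized to "_" in config names,
    # e.g. "arc:challenge" + 25 shots -> "harness_arc_challenge_25"
    normalized = task.replace(":", "_").replace("-", "_")
    return f"harness_{normalized}_{num_fewshot}"

print(harness_config_name("arc:challenge", 25))
# harness_arc_challenge_25
print(harness_config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above.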
## Latest results
These are the [latest results from run 2024-01-18T13:28:00.366540](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.21/blob/main/results_2024-01-18T13-28-00.366540.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6540458763545218,
"acc_stderr": 0.032093019516955965,
"acc_norm": 0.6534601787133112,
"acc_norm_stderr": 0.032764115724543935,
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960867,
"mc2": 0.6799010994882542,
"mc2_stderr": 0.01527627642493985
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726291,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.01328452529240352
},
"harness|hellaswag|10": {
"acc": 0.712109141605258,
"acc_stderr": 0.004518546274738885,
"acc_norm": 0.8807010555666202,
"acc_norm_stderr": 0.003234774980647951
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887027,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887027
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846178,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846178
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608313,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4692737430167598,
"acc_stderr": 0.016690896161944385,
"acc_norm": 0.4692737430167598,
"acc_norm_stderr": 0.016690896161944385
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960867,
"mc2": 0.6799010994882542,
"mc2_stderr": 0.01527627642493985
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292406
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.01267929754951543
}
}
```
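If needed, the reported standard errors above can be turned into rough 95% confidence intervals via a normal approximation. This is a quick interpretive sketch, not part of the official evaluation pipeline:

```python
# Rough 95% confidence interval from a reported score and its standard error,
# using the normal approximation (z ~= 1.96).
def confidence_interval(score: float, stderr: float, z: float = 1.96):
    return score - z * stderr, score + z * stderr

# Overall accuracy reported above ("all" -> "acc")
low, high = confidence_interval(0.6540458763545218, 0.032093019516955965)
print(f"acc 95% CI: [{low:.4f}, {high:.4f}]")
```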
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
manishiitg/manishiitg-CogStack-QA | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 30586396
num_examples: 49330
download_size: 11513745
dataset_size: 30586396
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Salvatale/Santa-Maria-gemma | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 66288
num_examples: 56
download_size: 41132
dataset_size: 66288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zeusfsx/ukrainian-news | ---
license: unknown
task_categories:
- text-generation
language:
- uk
pretty_name: ukr-news
size_categories:
- 10M<n<100M
tags:
- news
---
# Ukrainian News Dataset
This is a dataset of news articles downloaded from various Ukrainian websites and Telegram channels.
The dataset contains 22 567 099 JSON objects (news articles) with a total size of ~67GB, each with the following fields:
```json
title: The title of the news article
text: The text of the news article, which may contain HTML tags(e.g., paragraphs, links, images, etc.)
url: The URL of the news article
datetime: The time of publication or when the article was parsed and added to the dataset
owner: The name of the website that published the news article
```
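Since the `text` field may contain HTML tags, consumers often strip the markup before use. A minimal sketch with the standard library (the sample record below is invented for illustration, not taken from the dataset; a real HTML parser is safer for messy pages):

```python
import json
import re

# An invented record following the field schema above (not real dataset data)
raw = '{"title": "Example title", "text": "<p>First paragraph.</p><p>Second paragraph.</p>", "url": "https://example.com/news/1", "datetime": "2023-01-01T00:00:00", "owner": "example.com"}'
record = json.loads(raw)

# Naive tag stripping: replace tags with spaces, then collapse whitespace
plain_text = re.sub(r"<[^>]+>", " ", record["text"])
plain_text = re.sub(r"\s+", " ", plain_text).strip()
print(plain_text)
```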
Count of news from websites: 16 022 416
Count of telegram posts: 6 544 683
The JSON objects are divided into parts, and the dataset is available for download via Hugging Face. The terms of use state that all data in this dataset is under the copyright of the owners of the respective websites.
## Accessing the Dataset
The dataset is available for download via the Hugging Face datasets library. You can install the library via pip:
```bash
pip install datasets
```
Once you have installed the library, you can load the dataset using the following code:
```python
from datasets import load_dataset
dataset = load_dataset('zeusfsx/ukrainian-news')
```
This will download and load the entire dataset. If you prefer to load only a subset of the data, you can specify the `split` argument:
```python
# Load only the first 10,000 examples from the "train" split
dataset = load_dataset('zeusfsx/ukrainian-news', split='train[:10000]')
```
## Contacts
If you have any questions or comments about this dataset, please contact me at [zeusfsxtmp@gmail.com]. I will do my best to respond to your inquiry as soon as possible.
## License
The dataset is made available under the terms of use specified by the owners of the respective websites. Please consult the individual websites for more information on their terms of use. |
Nexdata/Spanish_Speech_Data_by_Mobile_Phone_Reading | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Spanish_Speech_Data_by_Mobile_Phone_Reading
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/116?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data volume is 227 hours. It is recorded by native Spanish speakers from Spain, Mexico and Venezuela in a quiet environment. The recording contents cover various fields such as economy, entertainment, news and spoken language. All texts are manually transcribed. The sentence accuracy rate is 95%.
For more details, please refer to the link: https://www.nexdata.ai/datasets/116?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Spanish
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
mtc/multirc_train_merged | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 13294073
num_examples: 36051
download_size: 1418754
dataset_size: 13294073
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AnanthZeke/tamil_sentences_master_unique | ---
dataset_info:
features:
- name: sent_token
dtype: string
splits:
- name: train
num_bytes: 10655287341
num_examples: 32606463
download_size: 3795983791
dataset_size: 10655287341
---
# Dataset Card for "tamil_sentences_master_unique"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucataco/startuplogo-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 304090.0
num_examples: 23
download_size: 139825
dataset_size: 304090.0
---
# Dataset Card for "startuplogo-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatsuoDochiai/MitzV | ---
license: openrail
---
|
vigneshgs7/Boundary_detection_Doc_4 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 8757333932.0
num_examples: 176
download_size: 579002048
dataset_size: 8757333932.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FinancialSupport/SynthEscoJobAds | ---
license: apache-2.0
dataset_info:
features:
- name: job_ad
dtype: string
- name: escoLabel
dtype: string
- name: escoSkills
dtype: string
- name: seed
dtype: string
- name: num_rewrites
dtype: int64
splits:
- name: train
num_bytes: 30656
num_examples: 10
download_size: 31951
dataset_size: 30656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Logic123456789/Test_Liscence | ---
extra_gated_prompt: "We have translated the CoQA dataset into Chinese. Please read the following information carefully."
extra_gated_heading: "You must accept the agreement and submit your information to access this dataset"
extra_gated_fields:
  Name: text
  Email: text
  Organization: text
  Intended use: text
  I agree not to use this dataset for illegal purposes: checkbox
extra_gated_button_content: "I have read the agreement and agree to provide the relevant information"
license: other
task_categories:
- question-answering
language:
- zh
- en
---
# Dataset Card for luotuo-QA-A
## Dataset Description
- **Homepage:** https://github.com/LC1332/Luotuo-Chinese-LLM
- **Repository:** https://github.com/LC1332/Luotuo-QA
- **Point of Contact:** qinyu_luo@163.com
### Dataset Summary
The CoQA (Conversational Question Answering) dataset is a large-scale dataset for conversational question answering tasks, consisting of over 127,000 questions and their corresponding answers. These texts are derived from passages in seven different domains: children's stories, literature, middle and high school English exams, news, Wikipedia, Reddit, and Science.
The CoQA dataset has undergone simple cleaning and consists of 7,012 stories. Building upon this dataset, we have translated the entire collection into Chinese and performed augmentation. Each story contains around 5 questions, and each question has been augmented 5 times.
As this dataset is part of our Luotuo-QA project, we name this dataset as luotuo-QA-A. It aims to facilitate research and applications of conversational question answering in the Chinese language context.
You can find our Luotuo-QA project here: https://github.com/LC1332/Luotuo-QA
This dataset is suitable for training and evaluating Chinese conversational question answering models. It contributes to the advancement of Chinese natural language processing and provides researchers and developers with a benchmark to compare the performance of different models and explore new approaches.
We hope that this work will foster research and further innovation in conversational question answering tasks in the Chinese language context on a global scale.
### Languages
Chinese
### Data Instances
```
ๆๆฌ๏ผ้ฟๅฆๅฆๆพ็ป่ฎฒ็ปๆไธไธชๆ
ไบๅฌ๏ผๅ
ๅ๏ผๆไธไธช่ฏปไนฆไบบไฝๅจๅคๅบ้็จๅ๏ผๆ้ด๏ผ ๅจ้ขๅญ้็บณๅ็ๆถๅ๏ผ็ช็ถๅฌๅฐๆไบบๅจๅซไปใ็ญๅบ็๏ผๅ้ข็ๆถ๏ผๅด่งไธไธช็พๅฅณ็ ่ธ้ฒๅจๅขๅคดไธ๏ผๅไปไธ็ฌ๏ผ้ๅปไบใไปๅพ้ซๅ
ด๏ผไฝ็ซ็ป้ฃ่ตฐๆฅๅค่ฐ็่ๅๅฐ่ฏ็ ดไบ ๆบๅ
ณใ่ฏดไป่ธไธๆไบๅฆๆฐ๏ผไธๅฎ้่งโ็พๅฅณ่โไบ๏ผ่ฟๆฏไบบ้ฆ่่บซ็ๆช็ฉ๏ผ่ฝๅคไบบ ๅ๏ผๅไธ็ญๅบ๏ผๅค้ดไพฟ่ฆๆฅๅ่ฟไบบ็่็ใไป่ช็ถๅๅพ่ฆๆญป๏ผ่้ฃ่ๅๅฐๅด้ๆ ๅฆจ ๏ผ็ปไปไธไธชๅฐ็ๅญ๏ผ่ฏดๅช่ฆๆพๅจๆ่พน๏ผไพฟๅฏ้ซๆ่ๅงใไป่ฝ็ถ็
งๆ ทๅ๏ผๅดๆปๆฏ็กไธ ็๏ผโโๅฝ็ถ็กไธ็็ใๅฐๅๅค๏ผๆ็ถๆฅไบ๏ผๆฒๆฒๆฒ๏ผ้จๅค่ฑกๆฏ้ฃ้จๅฃฐใไปๆญฃๆไฝ ไธๅขๆถ๏ผๅดๅฌๅพ่ฑ็ไธๅฃฐ๏ผไธ้้ๅ
ไปๆ่พน้ฃๅบ๏ผๅค้ขไพฟไปไนๅฃฐ้ณไนๆฒกๆไบ๏ผ้ฃ้ ๅ
ไนๅฐฑ้ฃๅๆฅ๏ผๆๅจ็ๅญ้ใๅๆฅๅข๏ผๅๆฅ๏ผ่ๅๅฐ่ฏด๏ผ่ฟๆฏ้ฃ่่ฃ๏ผๅฎ่ฝๅธ่็ ่้ซ๏ผ็พๅฅณ่ๅฐฑ่ขซๅฎๆฒปๆญปไบใ
ๅๅง้ฎ้ขไธบ๏ผ่ฐ้ๅฐไบ็พๅฅณ่๏ผ
้ฎ้ข่ฝฌไนไธบ:่ฐ่ขซ็พๅฅณ่ๆๅฐๆฐ?
็ญๆกไธบ:่ฏปไนฆไบบ
้ฎ้ข่ฝฌไนไธบ:็พๅฅณ่่ขญๅปไบ่ฐ?
็ญๆกไธบ:่ฏปไนฆไบบ
ๅๅง้ฎ้ขไธบ๏ผ่ฐๆไบ็พๅฅณ่
้ฎ้ข่ฝฌไนไธบ:่ฐๆๆญปไบ็พๅฅณ่
็ญๆกไธบ:้ฃ่่ฃ
```
### Licensing Information
Our license aligns with the original licenses of the CoQA dataset. Please refer to the following information.
CoQA contains passages from seven domains; five of these are made public under the following licenses.
We performed translation and augmentation on the CoQA dataset, so the generated part of the data still complies with the original CoQA agreement:
Literature and Wikipedia passages are shared under CC BY-SA 4.0 license.
Children's stories are collected from MCTest which comes with MSR-LA license.
Middle/High school exam passages are collected from RACE which comes with its own license.
News passages are collected from the DeepMind CNN dataset which comes with Apache license.
### Citation Information
Please cite us if you use the data or code in this repo.
```bibtex
@article{your-article,
title = {Your Article Title},
author = {Author Name},
journal = {Journal Name},
year = {2023},
volume = {X},
number = {X},
pages = {X-X},
doi = {DOI}
}
```
### Contributions
Thanks to @XXX, @XXXXXX, @XXXX, @XXXXXX, @XXXXXX, @XXX for adding this dataset. |
hgissbkh/translation-preference-data | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: chosen_sentence
dtype: string
- name: rejected_sentence
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: candidates
sequence: string
- name: scores
sequence: float64
- name: src
dtype: string
- name: src_lang
dtype: string
- name: tgt_lang
dtype: string
splits:
- name: train
num_bytes: 95141992
num_examples: 22073
download_size: 25533811
dataset_size: 95141992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NouRed/plant-disease-recognition | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1164401767.326
num_examples: 1322
download_size: 1169635181
dataset_size: 1164401767.326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-1006ec-1466153987 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
CyberHarem/intrepid_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of intrepid (Kantai Collection)
This is the dataset of intrepid (Kantai Collection), containing 457 images and their tags.
The core tags of this character are `brown_hair, ponytail, short_hair, blue_eyes, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 457 | 385.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/intrepid_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 457 | 251.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/intrepid_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1050 | 520.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/intrepid_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 457 | 356.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/intrepid_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1050 | 676.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/intrepid_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/intrepid_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 38 |  |  |  |  |  | 1girl, black_shirt, solo, looking_at_viewer, simple_background, grey_neckerchief, smile, white_background, white_neckerchief, white_skirt, open_mouth, cowboy_shot, short_sleeves, grey_skirt, upper_body |
| 1 | 8 |  |  |  |  |  | 1girl, blush, competition_swimsuit, cowboy_shot, hair_between_eyes, looking_at_viewer, simple_background, smile, solo, white_background, twitter_username, collarbone, covered_navel, highleg_swimsuit, alternate_costume, blue_one-piece_swimsuit, cropped_legs, armpits, groin, open_mouth, arms_behind_head, arms_up, closed_mouth, huge_breasts |
| 2 | 5 |  |  |  |  |  | 1girl, collarbone, competition_swimsuit, looking_at_viewer, simple_background, solo, white_background, blue_one-piece_swimsuit, cleavage, cowboy_shot, hair_between_eyes, blush, dated, leaning_forward, one-hour_drawing_challenge, open_mouth, twitter_username |
| 3 | 14 |  |  |  |  |  | 1girl, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, black_leotard, simple_background, solo, wrist_cuffs, looking_at_viewer, strapless_leotard, white_background, cleavage, smile, black_pantyhose, rabbit_tail, cowboy_shot, open_mouth, black_bowtie, blush |
| 4 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, full_body, navel, nipples, simple_background, blush, completely_nude, barefoot, collarbone, standing, white_background, female_pubic_hair, open_mouth, sitting, smile |
| 5 | 5 |  |  |  |  |  | 1girl, smile, solo, blue_coat, looking_at_viewer, white_scarf, hair_between_eyes, long_sleeves, official_alternate_costume, simple_background, upper_body, blush, open_mouth, shoulder_bag, skirt, standing, white_background, white_sweater |
| 6 | 10 |  |  |  |  |  | blush, hetero, solo_focus, 1boy, 1girl, nipples, open_mouth, penis, collarbone, mosaic_censoring, completely_nude, paizuri, sex, bangs, hair_between_eyes, navel, sweat, looking_at_viewer, pussy, shirt, smile, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shirt | solo | looking_at_viewer | simple_background | grey_neckerchief | smile | white_background | white_neckerchief | white_skirt | open_mouth | cowboy_shot | short_sleeves | grey_skirt | upper_body | blush | competition_swimsuit | hair_between_eyes | twitter_username | collarbone | covered_navel | highleg_swimsuit | alternate_costume | blue_one-piece_swimsuit | cropped_legs | armpits | groin | arms_behind_head | arms_up | closed_mouth | huge_breasts | cleavage | dated | leaning_forward | one-hour_drawing_challenge | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | black_leotard | wrist_cuffs | strapless_leotard | black_pantyhose | rabbit_tail | black_bowtie | full_body | navel | nipples | completely_nude | barefoot | standing | female_pubic_hair | sitting | blue_coat | white_scarf | long_sleeves | official_alternate_costume | shoulder_bag | skirt | white_sweater | hetero | solo_focus | 1boy | penis | mosaic_censoring | paizuri | sex | bangs | sweat | pussy | shirt | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------|:--------------------|:--------------------|:-------------------|:--------|:-------------------|:--------------------|:--------------|:-------------|:--------------|:----------------|:-------------|:-------------|:--------|:-----------------------|:--------------------|:-------------------|:-------------|:----------------|:-------------------|:--------------------|:--------------------------|:---------------|:----------|:--------|:-------------------|:----------|:---------------|:---------------|:-----------|:--------|:------------------|:-----------------------------|:------------------|:-------------------|:----------------|:--------------|:----------------|:--------------|:--------------------|:------------------|:--------------|:---------------|:------------|:--------|:----------|:------------------|:-----------|:-----------|:--------------------|:----------|:------------|:--------------|:---------------|:-----------------------------|:---------------|:--------|:----------------|:---------|:-------------|:-------|:--------|:-------------------|:----------|:------|:--------|:--------|:--------|:--------|:----------|
| 0 | 38 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | X | X | | X | X | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | | | X | | | X | X | | | | X | X | X | X | X | | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | | X | X | X | | X | X | | | X | X | | | | X | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | X | | X | X | | | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | X | | X | X | | | X | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | | X | | | X | | | | X | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B | ---
pretty_name: Evaluation run of Locutusque/Hercules-2.0-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Hercules-2.0-Mistral-7B](https://huggingface.co/Locutusque/Hercules-2.0-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T13:12:07.013905](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B/blob/main/results_2024-02-09T13-12-07.013905.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6332465979918371,\n\
\ \"acc_stderr\": 0.03235955493460707,\n \"acc_norm\": 0.6377302097946538,\n\
\ \"acc_norm_stderr\": 0.03300999270530235,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4396723008156011,\n\
\ \"mc2_stderr\": 0.014161167393006498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508397,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6313483369846644,\n\
\ \"acc_stderr\": 0.004814532642574651,\n \"acc_norm\": 0.836885082652858,\n\
\ \"acc_norm_stderr\": 0.003687153940568797\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629454,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \"\
acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.030047357655806635,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.030047357655806635\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.01526867731760228,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.01526867731760228\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580217,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580217\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.4396723008156011,\n\
\ \"mc2_stderr\": 0.014161167393006498\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.01135031570746206\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.444275966641395,\n \
\ \"acc_stderr\": 0.013686685712261669\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Hercules-2.0-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|arc:challenge|25_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|arc:challenge|25_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|gsm8k|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|gsm8k|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hellaswag|10_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hellaswag|10_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T19-21-33.913590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T13-12-07.013905.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- '**/details_harness|winogrande|5_2024-02-03T19-21-33.913590.parquet'
- split: 2024_02_09T13_12_07.013905
path:
- '**/details_harness|winogrande|5_2024-02-09T13-12-07.013905.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T13-12-07.013905.parquet'
- config_name: results
data_files:
- split: 2024_02_03T19_21_33.913590
path:
- results_2024-02-03T19-21-33.913590.parquet
- split: 2024_02_09T13_12_07.013905
path:
- results_2024-02-09T13-12-07.013905.parquet
- split: latest
path:
- results_2024-02-09T13-12-07.013905.parquet
---
# Dataset Card for Evaluation run of Locutusque/Hercules-2.0-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Hercules-2.0-Mistral-7B](https://huggingface.co/Locutusque/Hercules-2.0-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T13:12:07.013905](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B/blob/main/results_2024-02-09T13-12-07.013905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6332465979918371,
"acc_stderr": 0.03235955493460707,
"acc_norm": 0.6377302097946538,
"acc_norm_stderr": 0.03300999270530235,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4396723008156011,
"mc2_stderr": 0.014161167393006498
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.014426211252508397,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6313483369846644,
"acc_stderr": 0.004814532642574651,
"acc_norm": 0.836885082652858,
"acc_norm_stderr": 0.003687153940568797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629454,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.030047357655806635,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.030047357655806635
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899126,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899126
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.01526867731760228,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.01526867731760228
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580217,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580217
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.4396723008156011,
"mc2_stderr": 0.014161167393006498
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.01135031570746206
},
"harness|gsm8k|5": {
"acc": 0.444275966641395,
"acc_stderr": 0.013686685712261669
}
}
```
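For quick sanity checks, the per-task dictionary above can be aggregated directly. A minimal sketch (the `excerpt` dict below is illustrative, and the leaderboard's official average is computed over a fixed benchmark set, so numbers will differ from the "all" entry):

```python
# Macro-average the "acc" scores from a results dict shaped like the JSON
# above; entries without an "acc" key (e.g. truthfulqa:mc) are skipped,
# as is the precomputed "all" entry.
def macro_average_acc(results: dict) -> float:
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task != "all" and "acc" in scores
    ]
    return sum(accs) / len(accs)

# Small illustrative excerpt (not the full results above).
excerpt = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.58},
    "harness|hellaswag|10": {"acc": 0.63},
    "harness|truthfulqa:mc|0": {"mc1": 0.29, "mc2": 0.44},
}
print(macro_average_acc(excerpt))  # mean of 0.58 and 0.63
```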
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
matemato/pokemon_bulbapedia_3_sentence | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 100831984.0
num_examples: 721
download_size: 83967282
dataset_size: 100831984.0
---
# Dataset Card for "pokemon_bulbapedia_descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WebraftAI/synapsellm-v0-2-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 14577107
num_examples: 18947
download_size: 8208827
dataset_size: 14577107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "synapsellm-v0-2-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/French_Speech_Data_by_Mobile_Phone_Guiding | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/French_Speech_Data_by_Mobile_Phone_Guiding
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/115?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
401 speakers participated in this recording, each reading 50 sentences, for a total of 10.9 hours. The recording texts cover in-car, smart home, and smart speech assistant scenarios. Transcriptions were manually checked for accuracy. Recording devices were mainstream Android phones and iPhones. The data can be used for in-car, smart home, and speech assistant applications.
For more details, please refer to the link: https://www.nexdata.ai/datasets/115?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
French
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/97386107 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1331
dataset_size: 180
---
# Dataset Card for "97386107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kdwm/weather-sentences | ---
license: mit
---
|
ibranze/araproje_hellaswag_tr_conf_mixscore | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87122
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mixscore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Verah/tatoeba_dedupe_en-jp_2024-March-01 | ---
license: cc-by-2.0
task_categories:
- translation
language:
- en
- ja
size_categories:
- 100K<n<1M
---
English-Japanese sentence pairs taken from https://tatoeba.org/en/downloads and then deduplicated.
Row order has also been randomized to avoid clusters of similar translations. |
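The preprocessing described above can be sketched as follows (a toy illustration on made-up pairs; the actual script used to build this dataset is not published here):

```python
import random

# Toy English-Japanese pairs containing one exact duplicate.
pairs = [
    ("Hello.", "こんにちは。"),
    ("Thank you.", "ありがとう。"),
    ("Hello.", "こんにちは。"),  # exact duplicate to be dropped
]

# Deduplicate (keeping first occurrences), then randomize row order so
# that similar translations do not cluster together.
unique_pairs = list(dict.fromkeys(pairs))
random.seed(0)
random.shuffle(unique_pairs)

print(len(unique_pairs))  # 2
```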
ThierryZhou/test | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
source_datasets:
- original
task_categories:
- image-to-text
task_ids:
- image-captioning
pretty_name: Test
---
# Dataset Card for "test"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Preprocessing](#dataset-preprocessing)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [RedCaps homepage](https://redcaps.xyz/)
- **Repository:** [RedCaps repository](https://github.com/redcaps-dataset/redcaps-downloader)
- **Paper:** [RedCaps: web-curated image-text data created by the people, for the people](https://arxiv.org/abs/2111.11431)
- **Leaderboard:**
- **Point of Contact:** [Karan Desai](mailto:kdexd@umich.edu)
### Dataset Summary
### Dataset Preprocessing
|
KenDoStudio/Tender-Treats_Bob_Velseb | ---
license: mit
---
|
Thytu/ChessInstruct | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
pretty_name: Chess Instruct
size_categories:
- 10K<n<100K
---
## ChessInstruct
The ChessInstruct Dataset serves as the foundation for training and fine-tuning Language Models (LLMs) specifically in the realm of chess instruction.
Derived from the [laion/strategic_game_chess](https://huggingface.co/datasets/laion/strategic_game_chess) dataset, this meticulously curated dataset encompasses a wide array of annotated instructional chess content.
Features of the ChessInstruct Dataset:
* **Rich and Diverse Content**: Curated with a broad spectrum of instructional resources including annotated games, strategic analyses (incoming) and positional evaluations, the dataset facilitates comprehensive learning and modeling.
* **Customizable Training Resource**: The ChessInstruct Dataset allows for the tailored fine-tuning of any Language Model, enabling researchers and practitioners to adapt and optimize LLMs for chess-specific instructional contexts.
* **Annotated Instructional Insights**: Detailed annotations and instructional cues within the dataset provide valuable guidance for language model training, emphasizing strategic moves, tactics, and decision-making processes.
## Usage
The ChessInstruct dataset comprises four primary columns:
* `task`: This column contains instruct prompts related to various chess scenarios, such as predicting the winner given a set of chess moves or identifying the last move in a sequence.
* `input`: The input column provides supplementary information, usually a series of chess moves, to support the instruct prompt. These inputs are presented as JSON-serialized strings.
* `expected_output`: This column presents the anticipated or expected output corresponding to the instruct task. The expected outputs are also serialized as JSON strings.
* `KIND`: The KIND column categorizes the type of instruct prompt, delineating the nature of the task, whether it involves identifying winning scenarios, predicting subsequent moves, or performing other chess-related analyses.
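Because `input` and `expected_output` are JSON-serialized strings, they should be decoded after loading. A minimal sketch on a made-up row shaped like the columns above (the field names inside the JSON and the `KIND` value are illustrative, not taken from the dataset):

```python
import json

# A made-up row shaped like the four columns described above.
row = {
    "task": "Given a series of chess moves, find the last move played.",
    "input": json.dumps({"moves": ["e4", "e5", "Nf3", "Nc6"]}),
    "expected_output": json.dumps({"last_move": "Nc6"}),
    "KIND": "find_last_move",
}

# Decode the serialized fields back into Python objects.
moves = json.loads(row["input"])["moves"]
expected = json.loads(row["expected_output"])["last_move"]
print(moves[-1], "==", expected)
```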
### Distribution
| Task | Number of samples training set | Number of samples test set | Distribution |
|------|--------------------------------|----------------------------|--------------|
| Finding last movement | 13500 | 1500 | 15% |
| Finding game's score | 18000 | 2000 | 20% |
| Finding missing movements | 13500 | 1500 | 15% |
| Finding the best possible move to do | 18000 | 2000 | 20% |
| Finding who is advantaged in the game | 18000 | 2000 | 20% |
| Sorting FENs from earliest to oldest in the game | 9000 | 1000 | 10% |
## Reproduction
All the necessary code to reproduce this dataset is available here: [Thytu/StockLLM](https://github.com/Thytu/StockLLM)
## Citation
This dataset is based on [laion/strategic_game_chess](https://huggingface.co/datasets/laion/strategic_game_chess?row=0), whose authors I thank dearly for the data |
lmg-anon/VNTL-v2-2k-small | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8479907
num_examples: 1666
- name: val
num_bytes: 1012198
num_examples: 199
download_size: 4197269
dataset_size: 9492105
---
# Dataset Card for "VNTL-v2-2k-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_absolute_reflex | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 57782
num_examples: 213
- name: train
num_bytes: 146272
num_examples: 539
- name: validation
num_bytes: 18479
num_examples: 68
download_size: 150052
dataset_size: 222533
---
# Dataset Card for "MULTI_VALUE_mrpc_absolute_reflex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dampish/QuickTrain | ---
license: cc-by-nc-4.0
---
|
Sgevreolete/A7 | ---
license: unknown
---
|
TICK666/Basic-Math-Chinese-1M-V1.1 | ---
license: llama2
task_categories:
- question-answering
language:
- zh
pretty_name: Basic-Math-Chinese-1M-V1.1
size_categories:
- 1M<n<10M
---
Compared with the previous version:
· 1. Added exponentiation and root-extraction (square root) problems.
· 2. New generation ratios:
four basic operations 45%
linear equations in one unknown 30%
word problems 15%
powers and roots 10%
· 3. Added four-operations variants: during generation there is a 20% chance of appending the follow-up question "What does this number (plus/minus/times/divided by) a equal?" (stackable).
Contact: QQ: 2981447942
bilibili: 一高
字Tick |
CyberHarem/drake_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of drake/ドレイク/德雷克/드레이크 (Nikke: Goddess of Victory)
This is the dataset of drake/ドレイク/德雷克/드레이크 (Nikke: Goddess of Victory), containing 74 images and their tags.
The core tags of this character are `short_hair, white_hair, red_eyes, bangs, breasts, hair_ornament, earrings, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 97.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/drake_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 74 | 51.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/drake_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 175 | 111.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/drake_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 74 | 83.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/drake_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 175 | 160.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/drake_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/drake_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, bare_shoulders, looking_at_viewer, black_leotard, simple_background, smile, white_background, jewelry, black_gloves, elbow_gloves, covered_navel, hairclip, thighhighs |
| 1 | 8 |  |  |  |  |  | 1girl, blush, large_breasts, looking_at_viewer, nurse_cap, short_sleeves, solo, thighs, indoors, white_dress, ass, from_behind, looking_back, closed_mouth, cleavage, cowboy_shot, hairclip, sitting, white_headwear, white_panties, x_hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | looking_at_viewer | black_leotard | simple_background | smile | white_background | jewelry | black_gloves | elbow_gloves | covered_navel | hairclip | thighhighs | blush | large_breasts | nurse_cap | short_sleeves | thighs | indoors | white_dress | ass | from_behind | looking_back | closed_mouth | cleavage | cowboy_shot | sitting | white_headwear | white_panties | x_hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:----------------|:--------------------|:--------|:-------------------|:----------|:---------------|:---------------|:----------------|:-----------|:-------------|:--------|:----------------|:------------|:----------------|:---------|:----------|:--------------|:------|:--------------|:---------------|:---------------|:-----------|:--------------|:----------|:-----------------|:----------------|:------------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
lintang/lama_primed_negated | ---
language:
- en
dataset_info:
- config_name: ConceptNet
- config_name: GoogleRE
- config_name: SQUAD
- config_name: TREx
configs:
- config_name: ConceptNet
data_files:
- split: high_ranked
path: data/ConceptNet/high_ranked/*
- split: low_ranked
path: data/ConceptNet/low_ranked/*
- split: random
path: data/ConceptNet/random/*
- config_name: GoogleRE
data_files:
- split: high_ranked
path: data/GoogleRE/high_ranked/*
- split: low_ranked
path: data/GoogleRE/low_ranked/*
- split: random
path: data/GoogleRE/random/*
- config_name: SQUAD
data_files:
- split: high_ranked
path: data/SQUAD/high_ranked/*
- split: random
path: data/SQUAD/random/*
- config_name: TREx
data_files:
- split: high_ranked
path: data/TREx/high_ranked/*
- split: low_ranked
path: data/TREx/low_ranked/*
- split: random
path: data/TREx/random/*
---
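The `data_files` globs above assign each file path to a split. A simplified sketch of that resolution using Python's `fnmatch` (the file paths here are hypothetical examples, not actual repository contents):

```python
import fnmatch

# Split-to-glob mapping for the ConceptNet config, taken from the YAML above.
data_files = {
    "high_ranked": "data/ConceptNet/high_ranked/*",
    "low_ranked": "data/ConceptNet/low_ranked/*",
    "random": "data/ConceptNet/random/*",
}

def split_for(path):
    """Return the split whose glob pattern matches the file path, or None."""
    for split, pattern in data_files.items():
        if fnmatch.fnmatch(path, pattern):
            return split
    return None

print(split_for("data/ConceptNet/random/part-0.parquet"))  # random
```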
|
katossky/multi-domain-sentiment-books | ---
license: unknown
---
|
Harrietofthesea/public_test | ---
license: cc
---
|
ggarcia209/PV03 | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: uint8
- name: label
sequence:
sequence: uint8
splits:
- name: train
num_bytes: 971631024
num_examples: 1846
- name: validation
num_bytes: 121059120
num_examples: 230
download_size: 412805736
dataset_size: 1092690144
---
# Dataset Card for "PV03"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alvarobartt/evol-instruct | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: generations
sequence: string
- name: model_names
sequence: string
- name: output
dtype: string
- name: model_name
dtype: string
- name: evolved_instructions
sequence: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 15390
num_examples: 4
download_size: 31626
dataset_size: 15390
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Exqrch/IndonesianNMT | ---
task_categories:
- translation
language:
- id
- jv
- su
- ban
- min
---
This dataset is used in the paper ["Replicable Benchmarking of Neural Machine Translation (NMT) on Low-Resource Local Languages in Indonesia"](https://arxiv.org/abs/2311.00998).
This repository contains two types of data:
1. Monolingual (*.txt)
2. Bilingual (*.tsv)
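A hedged sketch of reading one of the bilingual `.tsv` files; the column layout assumed here (one tab-separated source/target sentence pair per line) is an assumption, and the sample text is illustrative only:

```python
import csv
import io

# Stand-in for the contents of a bilingual .tsv file: one tab-separated
# source/target sentence pair per line (the real column layout may differ).
sample = "halo dunia\thello world\nselamat pagi\tgood morning\n"

pairs = [tuple(row) for row in csv.reader(io.StringIO(sample), delimiter="\t")]
print(pairs[0])  # ('halo dunia', 'hello world')
```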
If used, please cite:
```
@misc{susanto2023replicable,
title={Replicable Benchmarking of Neural Machine Translation (NMT) on Low-Resource Local Languages in Indonesia},
author={Lucky Susanto and Ryandito Diandaru and Adila Krisnadhi and Ayu Purwarianti and Derry Wijaya},
year={2023},
eprint={2311.00998},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
This dataset is licensed under the [Creative Commons Attribution 4.0 International License (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/).
You are free to:
- Share: Copy and redistribute the material in any medium or format.
- Adapt: Remix, transform, and build upon the material for any purpose, even commercially.
Under the following terms:
- Attribution: You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
See the [full text of the license](https://creativecommons.org/licenses/by/4.0/) for more details.
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-63000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1084858
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|