datasetId | card |
|---|---|
angdong/nate-news-world | ---
license: mit
---
|
HuggingFaceFW/fineweb_french_extract | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
- name: media
sequence: 'null'
- name: metadata
struct:
- name: date
dtype: string
- name: dump
dtype: string
- name: file_path
dtype: string
- name: language
dtype: string
- name: language_score
dtype: float64
- name: url
dtype: string
splits:
- name: train
num_bytes: 28672930115
num_examples: 10000000
download_size: 15018708723
dataset_size: 28672930115
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maghwa/OpenHermes-2-AR-10K-18-600k-610k | ---
dataset_info:
features:
- name: topic
dtype: 'null'
- name: views
dtype: float64
- name: category
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: language
dtype: 'null'
- name: title
dtype: 'null'
- name: model_name
dtype: 'null'
- name: idx
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: hash
dtype: 'null'
- name: model
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: source
dtype: string
- name: id
dtype: 'null'
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 25345215
num_examples: 10001
download_size: 11487924
dataset_size: 25345215
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-professional_law | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 14459
num_examples: 5
- name: test
num_bytes: 14307400
num_examples: 1534
download_size: 2073867
dataset_size: 14321859
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-professional_law"
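Per the metadata above, `answer` is a `class_label` whose names are the letters A–D. A minimal sketch of decoding an example (the sample record below is invented for illustration, not drawn from the dataset):

```python
# Per the card's class_label config, answer indices 0-3 map to letters A-D.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(example):
    """Return the letter of the gold answer and the corresponding choice text."""
    idx = example["answer"]
    return ANSWER_NAMES[idx], example["choices"][idx]

# Invented sample following the documented schema.
sample = {
    "question": "Which standard of proof applies in most civil cases?",
    "choices": ["Preponderance of the evidence", "Beyond a reasonable doubt",
                "Clear and convincing evidence", "Probable cause"],
    "answer": 0,
}
letter, text = decode_answer(sample)
```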
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pinkbaka/rapgenius | ---
license: mit
---
|
alvarobartt/openhermes-preferences-metamath | ---
license: other
task_categories:
- text-generation
language:
- en
source_datasets:
- argilla/OpenHermesPreferences
annotations_creators:
- Argilla
- HuggingFaceH4
tags:
- dpo
- synthetic
- metamath
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 169676613.83305642
num_examples: 50799
- name: test
num_bytes: 18855183.863611557
num_examples: 5645
download_size: 44064373
dataset_size: 188531797.69666797
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for OpenHermes Preferences - MetaMath
This dataset is a subset of [`argilla/OpenHermesPreferences`](https://hf.co/datasets/argilla/OpenHermesPreferences)
that keeps only the `metamath` preferences and drops every column except `chosen` and `rejected`. Both columns
use the OpenAI chat format, which makes it straightforward to fine-tune a model with tools such as
[`huggingface/alignment-handbook`](https://github.com/huggingface/alignment-handbook) or [`axolotl`](https://github.com/OpenAccess-AI-Collective/axolotl), among others.
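As a rough, illustrative sketch of what the two columns contain and how they might be flattened into prompt/chosen/rejected triplets for a DPO-style trainer (the record below is invented, not taken from the dataset):

```python
# Invented record mirroring the card's schema: each column is a list of
# OpenAI-style chat messages with "role" and "content" keys.
example = {
    "chosen": [
        {"role": "user", "content": "What is 2 + 2?"},
        {"role": "assistant", "content": "2 + 2 = 4."},
    ],
    "rejected": [
        {"role": "user", "content": "What is 2 + 2?"},
        {"role": "assistant", "content": "5."},
    ],
}

def to_dpo_triplet(row):
    """Split the shared prompt turns from the final assistant responses."""
    prompt = [m for m in row["chosen"] if m["role"] != "assistant"]
    return {
        "prompt": prompt,
        "chosen": row["chosen"][-1]["content"],
        "rejected": row["rejected"][-1]["content"],
    }

triplet = to_dpo_triplet(example)
```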
## Reference
The [`argilla/OpenHermesPreferences`](https://hf.co/datasets/argilla/OpenHermesPreferences) dataset was created as a
collaborative effort between Argilla and the HuggingFaceH4 team at Hugging Face. |
DBQ/Gucci.Product.prices.Hungary | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Hungary - Gucci - Product-level price list
tags:
- webscraping
- ecommerce
- Gucci
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 2375242
num_examples: 4976
download_size: 686750
dataset_size: 2375242
---
# Gucci web scraped data
## About the website
The **luxury fashion industry** is a significant sector in the EMEA region, and especially in **Hungary**. The industry has found its footing in the digital world with the advent of **Ecommerce**, resulting in a significant surge in online sales. **Gucci**, an eminent player in this field, has established a strong online presence to capitalize on this trend. The dataset observed pertains to **Ecommerce product-list page (PLP) data** for Gucci in Hungary, where the brand witnesses considerable demand. This data provides valuable insights into consumer preferences and shopping patterns, helping players in this industry shape strategies to foster growth.
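As an illustrative sketch (the rows below are invented, not actual scraped Gucci data), the documented `full_price_eur`, `price_eur`, and `flg_discount` columns allow computing the discount depth per product:

```python
# Toy rows following the card's schema; values are invented for illustration.
rows = [
    {"product_code": "A1", "full_price_eur": 1000.0, "price_eur": 800.0, "flg_discount": 1},
    {"product_code": "B2", "full_price_eur": 500.0, "price_eur": 500.0, "flg_discount": 0},
]

def discount_pct(row):
    """Relative discount in percent; 0.0 when the item is not flagged as discounted."""
    if not row["flg_discount"]:
        return 0.0
    return round(100.0 * (row["full_price_eur"] - row["price_eur"]) / row["full_price_eur"], 1)

discounts = {r["product_code"]: discount_pct(r) for r in rows}
```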
## Link to **dataset**
[Hungary - Gucci - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Gucci%20Product-prices%20Hungary/r/rec95J1ypx0pNxrgA)
|
FangyuLei/tatqa | ---
license: bsd-3-clause
---
|
joachimsallstrom/lux_np | ---
license: creativeml-openrail-m
---
|
PeterLawrence/inova8.schema.1 | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 91121
num_examples: 85
download_size: 13427
dataset_size: 91121
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arbml/AraFacts | ---
dataset_info:
features:
- name: ClaimID
dtype: string
- name: claim
dtype: string
- name: description
dtype: string
- name: source
dtype: string
- name: date
dtype: string
- name: source_label
dtype: string
- name: normalized_label
dtype: string
- name: source_category
dtype: string
- name: normalized_category
dtype: string
- name: source_url
dtype: string
- name: claim_urls
dtype: string
- name: evidence_urls
dtype: string
- name: claim_type
dtype: string
splits:
- name: train
num_bytes: 13201528
num_examples: 6222
download_size: 5719822
dataset_size: 13201528
---
# Dataset Card for "AraFacts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-9000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1080551
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
benayas/massive_artificial_10pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 803392
num_examples: 11514
download_size: 262619
dataset_size: 803392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
indra-inc/docvqa_en_train_valid_20500 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: question
dtype: string
- name: docId
dtype: int64
- name: answers
sequence: string
- name: data_split
dtype: string
- name: bounding_boxes
sequence:
sequence: int64
- name: word_list
sequence: string
- name: image_raw
dtype: image
splits:
- name: train
num_bytes: 3168876041.0
num_examples: 20000
- name: valid
num_bytes: 236325170.0
num_examples: 500
download_size: 3158729168
dataset_size: 3405201211.0
---
# Dataset Card for "docvqa_en_train_valid_20500"
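Assuming `word_list` and `bounding_boxes` are parallel lists (the card does not state this explicitly), OCR words can be paired with their boxes roughly as follows, using an invented sample:

```python
def words_with_boxes(example):
    """Pair each OCR word with its bounding box (assumed to be parallel lists)."""
    return list(zip(example["word_list"], example["bounding_boxes"]))

# Invented sample following the documented schema.
sample = {
    "word_list": ["Invoice", "Total"],
    "bounding_boxes": [[10, 10, 80, 30], [10, 50, 70, 70]],
}
pairs = words_with_boxes(sample)
```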
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kogi-jwu/instructjhumaneval | ---
license: mit
task_categories:
- text2text-generation
language:
- ja
--- |
TurboPascal/tokenizers_example_zh_en | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
- en
size_categories:
- 1M<n<10M
---
Base text corpus for training tokenizers. |
ibranze/araproje_hellaswag_en_dynamic | ---
dataset_info:
features:
- name: keys
dtype: int64
- name: values
sequence: int64
splits:
- name: train
num_bytes: 23000
num_examples: 250
download_size: 6356
dataset_size: 23000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TinyPixel/elm-3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2536494
num_examples: 1073
download_size: 1390775
dataset_size: 2536494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
udmurtNLP/madlad-400-udmurt | ---
dataset_info:
features:
- name: sent
dtype: string
splits:
- name: train
num_bytes: 102168566
num_examples: 651456
download_size: 52503390
dataset_size: 102168566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- udm
size_categories:
- 100K<n<1M
---
# Usage madlad-400-udmurt
```py
from datasets import load_dataset
dataset = load_dataset("udmurtNLP/madlad-400-udmurt")
``` |
1rsh/tts-rj-hi-karya | ---
language:
- hi
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-to-speech
- automatic-speech-recognition
pretty_name: Rajasthani Hindi Speech Dataset
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 7425995581.812981
num_examples: 422603
- name: test
num_bytes: 74991388.79801954
num_examples: 4269
download_size: 7504372330
dataset_size: 7500986970.611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
tags:
- webdataset
---
# Rajasthani Hindi Speech Dataset
<!-- Provide a quick summary of the dataset. -->
This dataset consists of audio recordings of participants reading out stories in Rajasthani Hindi, one sentence at a time. 98 participants from Soda, Rajasthan (roughly 58 male and 40 female) each read 30 stories, yielding 426,872 recordings in total.
> **Point to Note:**
> While random sampling suggests that most participants made a good-faith effort to read the sentences accurately, no quality analysis has been performed on the data, so some recordings may contain errors.
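When loaded with 🤗 Datasets, each `audio` cell decodes to a dict with `array`, `sampling_rate`, and `path` keys; a toy sketch of computing clip duration (the cell below is synthetic one-second silence, not a real recording from the corpus):

```python
# Synthetic decoded audio cell; real cells come from
# load_dataset("1rsh/tts-rj-hi-karya") with the Audio feature.
cell = {"array": [0.0] * 16000, "sampling_rate": 16000, "path": None}

def duration_seconds(audio_cell):
    """Clip length in seconds: number of samples divided by the sampling rate."""
    return len(audio_cell["array"]) / audio_cell["sampling_rate"]

dur = duration_seconds(cell)
```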
<!-- Provide a longer summary of what this dataset is. -->
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Link:** [Download](https://www.microsoft.com/en-gb/download/details.aspx?id=105385)
- **Curated By:** [Project Karya](https://www.microsoft.com/en-us/research/project/project-karya/overview/)
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each example has two fields, `audio` and `sentence`, containing the audio recording and its transcript respectively. |
JLuisMayor/calendario | ---
license: apache-2.0
---
|
senga-ml/dnotes-data-v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 169798787.0
num_examples: 87
- name: validation
num_bytes: 29563876.0
num_examples: 10
- name: test
num_bytes: 16997016.0
num_examples: 6
download_size: 216159591
dataset_size: 216359679.0
---
# Dataset Card for "dnotes-data-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/wheel-chair-images-annotation4object-detec | ---
dataset_info:
features:
- name: image
dtype: image
- name: instances
list:
- name: box
sequence: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 25165898.049
num_examples: 1107
download_size: 0
dataset_size: 25165898.049
license: apache-2.0
task_categories:
- object-detection
language:
- en
pretty_name: wheel_chair_detection
size_categories:
- 1K<n<10K
---
# Wheelchair Dataset for Object Detection
## Dataset Information
The `dataset_info` metadata above describes the wheelchair dataset designed for object detection. Here are the details:
### Features
- **image**: Represents the images in the dataset.
- Data type: `image`
- **instances**: Represents the instances within each image. Each instance consists of a bounding box and a label.
- Data type: `list`
- Sub-features:
- **box**: Bounding box coordinates for each instance.
- Data type: `float64`
- **label**: Label for each instance.
- Data type: `int64`
### Splits
- **Train**: This split, named "train," contains a total of 1,107 examples.
- Number of bytes: 25,165,898.049
- Number of examples: 1,107
### Dataset Size
- Download size: 0 (no download required)
- Dataset size: 25,165,898.049 bytes
## Wheelchair Class Name
The dataset includes the following class names for object detection:
```json
"labels": ClassLabel(names=["person", "wheel_chair", "not wheel chair"])
```
The class labels are defined as follows:
- "person"
- "wheel_chair"
- "not wheel chair"
## Object Detection Application (YOLO Models)
You can use the dataset with YOLO-family models for object detection tasks; the model's class labels correspond to the class names defined above. Follow the appropriate implementation guidelines and examples for YOLO models to leverage this dataset effectively.
```python
# Load the dataset (requires the `datasets` library)
from datasets import load_dataset

hf_dataset = load_dataset("Falah/wheel-chair-images-annotation4object-detec", split="train")

# Access an image
image = hf_dataset[1]['image']

# Display the image
image.show()

# Access the label and bounding box of each annotated instance
class_names = ["person", "wheel_chair", "not wheel chair"]
instances = hf_dataset[1]['instances']
for instance in instances:
    label = instance['label']
    box = instance['box']
    # Labels are stored as plain int64, so map them to class names manually
    class_name = class_names[label]
    print(f"Label: {class_name}")
    print(f"Bounding Box: {box}")
```
## Citation
If you use this dataset in your research or any other work, please consider citing it as:
```
@dataset{wheel-chair-images-annotation4object-detec_dataset,
author = {Falah.G.Salieh},
title = {Wheelchair Dataset for Object Detection},
year = {2023},
publisher = {Hugging Face},
version = {1.0},
location = {Online},
url = {Falah/wheel-chair-images-annotation4object-detec}
}
```
## License
The Wheelchair Dataset for Object Detection is provided under the Apache-2.0 license. |
Cherrycreamco/g4aucfiltered | ---
license: apache-2.0
---
|
EleutherAI/quirky_squaring_increment0_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 1595505.5
num_examples: 23000
- name: validation
num_bytes: 67316.06
num_examples: 970
- name: test
num_bytes: 68355.30625
num_examples: 985
download_size: 578238
dataset_size: 1731176.86625
---
# Dataset Card for "quirky_squaring_increment0_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0 | ---
pretty_name: Evaluation run of DUAL-GPO/zephyr-7b-ipo-qlora-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DUAL-GPO/zephyr-7b-ipo-qlora-v0](https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-qlora-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T05:01:01.232002](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0/blob/main/results_2024-04-07T05-01-01.232002.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6336157909151666,\n\
\ \"acc_stderr\": 0.03259268627428965,\n \"acc_norm\": 0.6388963560492149,\n\
\ \"acc_norm_stderr\": 0.03325258805528349,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4535332788498766,\n\
\ \"mc2_stderr\": 0.014550976512746065\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.0143610972884497,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.0140978106780422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6427006572395937,\n\
\ \"acc_stderr\": 0.004782246931195,\n \"acc_norm\": 0.8436566421031667,\n\
\ \"acc_norm_stderr\": 0.0036243831208234508\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343139,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343139\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580215,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580215\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128455,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4535332788498766,\n\
\ \"mc2_stderr\": 0.014550976512746065\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597221\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.400303260045489,\n \
\ \"acc_stderr\": 0.013495926436566438\n }\n}\n```"
repo_url: https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-qlora-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|winogrande|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T05-01-01.232002.parquet'
- config_name: results
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- results_2024-04-07T05-01-01.232002.parquet
- split: latest
path:
- results_2024-04-07T05-01-01.232002.parquet
---
# Dataset Card for Evaluation run of DUAL-GPO/zephyr-7b-ipo-qlora-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DUAL-GPO/zephyr-7b-ipo-qlora-v0](https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-qlora-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0",
"harness_winogrande_5",
	split="latest")
```
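The config names above follow a predictable pattern (`harness_<task>_<n_fewshot>`, with `-` and `:` in task names replaced by `_`). As a small convenience sketch (this helper is not part of the card or the `datasets` library), the name for any task in this run can be built programmatically before passing it to `load_dataset`:

```python
def harness_config_name(task: str, n_fewshot: int) -> str:
    """Build a config name for this card, e.g. 'harness_gsm8k_5'.

    Assumes the naming convention visible in the configs above:
    '-' and ':' in the harness task name become '_'.
    """
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{n_fewshot}"


# Hypothetical usage with the repo from this card (requires network):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0",
#     harness_config_name("truthfulqa:mc", 0),
#     split="latest",
# )
```

For example, `harness_config_name("hendrycksTest-world_religions", 5)` yields `harness_hendrycksTest_world_religions_5`, matching the config list above.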
## Latest results
These are the [latest results from run 2024-04-07T05:01:01.232002](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0/blob/main/results_2024-04-07T05-01-01.232002.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results for each eval in its "results" and "latest" splits):
```python
{
"all": {
"acc": 0.6336157909151666,
"acc_stderr": 0.03259268627428965,
"acc_norm": 0.6388963560492149,
"acc_norm_stderr": 0.03325258805528349,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4535332788498766,
"mc2_stderr": 0.014550976512746065
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.0143610972884497,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.0140978106780422
},
"harness|hellaswag|10": {
"acc": 0.6427006572395937,
"acc_stderr": 0.004782246931195,
"acc_norm": 0.8436566421031667,
"acc_norm_stderr": 0.0036243831208234508
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343139,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343139
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580215,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580215
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128455,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4535332788498766,
"mc2_stderr": 0.014550976512746065
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597221
},
"harness|gsm8k|5": {
"acc": 0.400303260045489,
"acc_stderr": 0.013495926436566438
}
}
```
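The per-task scores above follow a regular naming scheme (`harness|hendrycksTest-<subject>|5`), so a macro-average over the MMLU subjects can be recomputed directly from a loaded results dict. A minimal sketch, using a hypothetical three-subject subset of the scores above for illustration (the full run covers all 57 MMLU subjects):

```python
# Recompute a macro-average accuracy over the MMLU (hendrycksTest) tasks
# from a results dict shaped like the JSON above. The scores here are a
# small subset copied for illustration, not the full result set.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6776315789473685},
}

# Select the MMLU subjects by their task-name prefix, then average "acc".
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
macro_avg = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"macro-average acc over {len(mmlu_tasks)} tasks: {macro_avg:.4f}")
```

The same prefix-based selection works on the full dict returned for the "results" configuration, since every MMLU task shares the `harness|hendrycksTest-` prefix.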
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mickume/wow | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16144133
num_examples: 73565
download_size: 9966747
dataset_size: 16144133
---
# Dataset Card for "wow"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca | ---
pretty_name: Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NbAiLab/nb-gpt-j-6B-alpaca](https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T15:55:19.313530](https://huggingface.co/datasets/open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca/blob/main/results_2023-07-19T15%3A55%3A19.313530.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2793361324347659,\n\
\ \"acc_stderr\": 0.03229367242405252,\n \"acc_norm\": 0.2819200448548028,\n\
\ \"acc_norm_stderr\": 0.03229676260655511,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3799804264451723,\n\
\ \"mc2_stderr\": 0.014771856203795355\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3447098976109215,\n \"acc_stderr\": 0.013888816286782114,\n\
\ \"acc_norm\": 0.36860068259385664,\n \"acc_norm_stderr\": 0.014097810678042184\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44602668791077477,\n\
\ \"acc_stderr\": 0.004960624576987787,\n \"acc_norm\": 0.574586735710018,\n\
\ \"acc_norm_stderr\": 0.004933950953380902\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.039446241625011175,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.039446241625011175\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756193,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756193\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.0336876293225943,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.0336876293225943\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319619,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319619\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238156,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238156\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.02413763242933772,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.02413763242933772\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.02160629449464773,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.02160629449464773\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889193,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.326605504587156,\n \"acc_stderr\": 0.020106990889937303,\n \"\
acc_norm\": 0.326605504587156,\n \"acc_norm_stderr\": 0.020106990889937303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674093,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674093\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.029442495585857487,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.029442495585857487\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594626,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083498,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531772,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531772\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n\
\ \"acc_stderr\": 0.02704685763071666,\n \"acc_norm\": 0.21794871794871795,\n\
\ \"acc_norm_stderr\": 0.02704685763071666\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3052362707535121,\n\
\ \"acc_stderr\": 0.01646771194763513,\n \"acc_norm\": 0.3052362707535121,\n\
\ \"acc_norm_stderr\": 0.01646771194763513\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.02656892101545715,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.02656892101545715\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537776,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537776\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n\
\ \"acc_stderr\": 0.010906282617981657,\n \"acc_norm\": 0.23989569752281617,\n\
\ \"acc_norm_stderr\": 0.010906282617981657\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815194,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.03119223072679566,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.03119223072679566\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.03484331592680588,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.03484331592680588\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3799804264451723,\n\
\ \"mc2_stderr\": 0.014771856203795355\n }\n}\n```"
repo_url: https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:55:19.313530.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:55:19.313530.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_55_19.313530
path:
- results_2023-07-19T15:55:19.313530.parquet
- split: latest
path:
- results_2023-07-19T15:55:19.313530.parquet
---
# Dataset Card for Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NbAiLab/nb-gpt-j-6B-alpaca](https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-07-19T15:55:19.313530](https://huggingface.co/datasets/open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca/blob/main/results_2023-07-19T15%3A55%3A19.313530.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the timestamped and "latest" splits):
```python
{
"all": {
"acc": 0.2793361324347659,
"acc_stderr": 0.03229367242405252,
"acc_norm": 0.2819200448548028,
"acc_norm_stderr": 0.03229676260655511,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3799804264451723,
"mc2_stderr": 0.014771856203795355
},
"harness|arc:challenge|25": {
"acc": 0.3447098976109215,
"acc_stderr": 0.013888816286782114,
"acc_norm": 0.36860068259385664,
"acc_norm_stderr": 0.014097810678042184
},
"harness|hellaswag|10": {
"acc": 0.44602668791077477,
"acc_stderr": 0.004960624576987787,
"acc_norm": 0.574586735710018,
"acc_norm_stderr": 0.004933950953380902
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.039446241625011175,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.039446241625011175
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.0336876293225943,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.0336876293225943
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319619,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319619
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238156,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238156
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933772,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933772
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.02160629449464773,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.02160629449464773
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889193,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.326605504587156,
"acc_stderr": 0.020106990889937303,
"acc_norm": 0.326605504587156,
"acc_norm_stderr": 0.020106990889937303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674093,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674093
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857487,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857487
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083498,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531772,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531772
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02704685763071666,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02704685763071666
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3052362707535121,
"acc_stderr": 0.01646771194763513,
"acc_norm": 0.3052362707535121,
"acc_norm_stderr": 0.01646771194763513
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981657,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815194,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.03119223072679566,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.03119223072679566
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.03484331592680588,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.03484331592680588
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3799804264451723,
"mc2_stderr": 0.014771856203795355
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Renanriozz/renanrzrz | ---
license: afl-3.0
---
|
offenseval2020_tr | ---
annotations_creators:
- found
language_creators:
- found
language:
- tr
license:
- cc-by-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: OffensEval-TR 2020
tags:
- offensive-language-classification
dataset_info:
features:
- name: id
dtype: int32
- name: tweet
dtype: string
- name: subtask_a
dtype:
class_label:
names:
'0': NOT
'1': 'OFF'
config_name: offenseval2020-turkish
splits:
- name: train
num_bytes: 4260505
num_examples: 31756
- name: test
num_bytes: 481300
num_examples: 3528
download_size: 2048258
dataset_size: 4741805
---
# Dataset Card for OffensEval-TR 2020
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [offensive-turkish](https://coltekin.github.io/offensive-turkish/)
- **Paper:** [A Corpus of Turkish Offensive Language on Social Media](https://coltekin.github.io/offensive-turkish/troff.pdf)
- **Point of Contact:** [Çağrı Çöltekin](ccoltekin@sfs.uni-tuebingen.de)
### Dataset Summary
The file offenseval-tr-training-v1.tsv contains 31,756 annotated tweets.
The file offenseval-annotation.txt contains a short summary of the annotation guidelines.
Twitter user mentions were substituted by @USER, and URLs were substituted by URL.
Each instance is annotated with one label for the following sub-task:
- Sub-task A: Offensive language identification
### Supported Tasks and Leaderboards
The dataset was introduced in this [paper](https://coltekin.github.io/offensive-turkish/troff.pdf).
### Languages
The dataset is based on Turkish.
## Dataset Structure
### Data Instances
A binary dataset with (NOT) Not Offensive and (OFF) Offensive tweets.
### Data Fields
Instances are included in TSV format as follows: `ID INSTANCE SUBA`
The column names in the file are the following: `id tweet subtask_a`
The labels used in the annotation are listed below.
#### Task and Labels
(A) Sub-task A: Offensive language identification
- (NOT) Not Offensive - This post does not contain offense or profanity.
- (OFF) Offensive - This post contains offensive language or a targeted (veiled or direct) offense
In our annotation, we label a post as offensive (OFF) if it contains any form of non-acceptable language (profanity) or a targeted offense, which can be veiled or direct.
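The TSV layout and label set above can be parsed with the standard library alone. A minimal sketch, with a hypothetical two-row sample (the ids and tweets are made up; the label-to-id mapping `NOT` → 0, `OFF` → 1 follows the class definition in the dataset metadata):

```python
import csv
import io

# Hypothetical sample rows in the TSV layout described above (id, tweet, subtask_a).
sample_tsv = """id\ttweet\tsubtask_a
20948\t@USER Bu ne güzel bir gün\tNOT
10134\t@USER URL berbat bir paylaşım\tOFF
"""

def read_offenseval_tsv(fileobj):
    """Parse rows into dicts, mapping subtask_a to the 0/1 class ids
    ('0': NOT, '1': OFF) used by the dataset's features."""
    label2id = {"NOT": 0, "OFF": 1}
    reader = csv.DictReader(fileobj, delimiter="\t")
    return [
        {"id": int(row["id"]), "tweet": row["tweet"],
         "subtask_a": label2id[row["subtask_a"]]}
        for row in reader
    ]

rows = read_offenseval_tsv(io.StringIO(sample_tsv))
print(rows[0])  # first parsed row; label 0 corresponds to NOT
```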
### Data Splits
| train | test |
|------:|-----:|
| 31756 | 3528 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
The source language producers are Twitter users.
### Annotations
[More Information Needed]
#### Annotation process
We describe the labels above in a "flat" manner. However, the annotation process we follow is hierarchical. The following question-answer pairs give a more flowchart-like procedure to follow:
1. Is the tweet in Turkish and understandable?
   * No: mark the tweet X for exclusion, and go to the next tweet
   * Yes: continue to step 2
2. Does the tweet include offensive/inappropriate language?
   * No: mark the tweet *non* and go to step 4
   * Yes: continue to step 3
3. Is the offense in the tweet targeted?
   * No: mark the tweet *prof* and go to step 4
   * Yes: choose one (or more) of *grp*, *ind*, *oth* based on the definitions above. Please try to limit the number of labels unless it is clear that the tweet includes offense against multiple categories.
4. Was the labeling decision difficult (a precise answer needs more context, the tweet includes irony, or for another reason)?
   * No: go to the next tweet
   * Yes: add the label X, and go to the next tweet
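The hierarchical procedure above can be sketched as a small decision function. This is illustrative only: the yes/no answers are supplied as booleans, and the label names (*non*, *prof*, *grp*, *ind*, *oth*, X) are taken from the guidelines above:

```python
def annotate(is_turkish, is_offensive, is_targeted, target_labels=(), is_difficult=False):
    """Walk through the hierarchical annotation steps for one tweet
    and return the list of labels an annotator would assign."""
    # Step 1: non-Turkish or incomprehensible tweets are marked X for exclusion.
    if not is_turkish:
        return ["X"]
    labels = []
    # Steps 2-3: offensive language, targeted or not.
    if not is_offensive:
        labels.append("non")
    elif not is_targeted:
        labels.append("prof")  # untargeted profanity
    else:
        labels.extend(target_labels)  # one or more of grp, ind, oth
    # Step 4: difficult decisions get the extra X label.
    if is_difficult:
        labels.append("X")
    return labels

print(annotate(True, False, False))                # ['non']
print(annotate(True, True, True, ("ind",), True))  # ['ind', 'X']
```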
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The annotations are distributed under the terms of the [Creative Commons Attribution License (CC-BY)](https://creativecommons.org/licenses/by/2.0/). Please cite the following paper if you use this resource.
### Citation Information
```
@inproceedings{coltekin2020lrec,
author = {\c{C}\"{o}ltekin, \c{C}a\u{g}r{\i}},
year = {2020},
title = {A Corpus of Turkish Offensive Language on Social Media},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
pages = {6174--6184},
address = {Marseille, France},
url = {https://www.aclweb.org/anthology/2020.lrec-1.758},
}
```
### Contributions
Thanks to [@yavuzKomecoglu](https://github.com/yavuzKomecoglu) for adding this dataset. |
alpayariyak/SkunkData-Corpus | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: cluster
dtype: float64
splits:
- name: train
num_bytes: 5246084850
num_examples: 6320610
download_size: 2483374214
dataset_size: 5246084850
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SkunkData-Corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrqorib/MEDISCO | ---
language: id
license: gpl-3.0
task_categories:
- automatic-speech-recognition
tags:
- medical
dataset_info:
splits:
- name: train
num_bytes: 2705299088
num_examples: 360
- name: test
num_bytes: 479374176
num_examples: 360
download_size: 3184673264
---
# Building MEDISCO: Indonesian Speech Corpus for Medical Domain
The dataset was published in the following paper:
> Building MEDISCO: Indonesian Speech Corpus for Medical Domain ([PDF](https://mrqorib.github.io/assets/pdf/MEDISCO.pdf) | [IEEEXplore](https://ieeexplore.ieee.org/abstract/document/8629259)) <br>
> [Muhammad Reza Qorib](https://mrqorib.github.io/) and [Mirna Adriani](https://cs.ui.ac.id/en/personnel/mirna-adriani/) <br>
> 2018 International Conference on Asian Language Processing (IALP)
Please look in the raw files (inside the "Files and versions" tab), as the Hugging Face dataset viewer does not show the text transcripts.
Please direct any questions to mrqorib[at]u.nus.edu as the email addresses in the paper PDF are no longer active. |
gwlms/biofid | ---
license: cc-by-4.0
---
|
TrainingDataPro/makeup-detection-dataset | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-to-image
- image-classification
tags:
- code
dataset_info:
features:
- name: no_makeup
dtype: image
- name: with_makeup
dtype: image
- name: part
dtype: string
- name: gender
dtype: string
- name: age
dtype: int8
- name: country
dtype: string
splits:
- name: train
num_bytes: 25845965
num_examples: 26
download_size: 25248180
dataset_size: 25845965
---
# Makeup Detection Dataset
The dataset consists of photos featuring the same individuals captured in two distinct scenarios - *with and without makeup*. The dataset contains a diverse range of individuals of various *ages, ethnicities and genders*. The images are of high quality, ensuring clarity and detail for each subject.
In photos with makeup, it is applied **to only specific parts** of the face, such as *eyes, lips, or skin*.
In photos without makeup, individuals have a bare face with no visible cosmetics or beauty enhancements. These images provide a clear contrast to the makeup images, allowing for meaningful visual analysis.
### The dataset's possible applications:
- facial recognition
- beauty consultations and personalized recommendations
- augmented reality and filters in photography apps
- social media and influencer marketing
- dermatology and skincare

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/makeup-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=makeup-detection-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
- **no_makeup**: includes images of people *without* makeup
- **with_makeup**: includes images of people *wearing makeup*. People are the same as in the previous folder, photos are identified by the same name
- **.csv** file: contains information about people in the dataset
### File with the extension .csv
includes the following information for each set of media files:
- **no_makeup**: link to the photo of a person without makeup,
- **with_makeup**: link to the photo of the person with makeup,
- **part**: body part of makeup's application,
- **gender**: gender of the person,
- **age**: age of the person,
- **country**: country of the person
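The .csv metadata described above pairs each bare-face photo with its made-up counterpart. A minimal sketch of reading it with the standard library, assuming a hypothetical sample (the file paths and rows are made up; the column names follow the list above):

```python
import csv
import io

# Hypothetical rows in the metadata layout described above; the real file
# links into the no_makeup and with_makeup folders.
sample_csv = """no_makeup,with_makeup,part,gender,age,country
no_makeup/01.jpg,with_makeup/01.jpg,lips,female,29,France
no_makeup/02.jpg,with_makeup/02.jpg,eyes,male,34,Brazil
"""

def pair_photos(fileobj):
    """Yield (before, after, region) triples so each person's photos
    can be compared across the two conditions."""
    for row in csv.DictReader(fileobj):
        yield row["no_makeup"], row["with_makeup"], row["part"]

pairs = list(pair_photos(io.StringIO(sample_csv)))
print(pairs[0])  # ('no_makeup/01.jpg', 'with_makeup/01.jpg', 'lips')
```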
# Images for makeup detection might be collected in accordance with your requirements.
## [TrainingData](https://trainingdata.pro/data-market/makeup-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=makeup-detection-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
miscjose/genius-music | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: title
dtype: string
- name: lyrics
dtype: string
splits:
- name: train
num_bytes: 72517163.34856215
num_examples: 27596
- name: test
num_bytes: 9065959.325718924
num_examples: 3450
- name: validation
num_bytes: 9065959.325718924
num_examples: 3450
download_size: 50430343
dataset_size: 90649082
task_categories:
- text-classification
language:
- en
tags:
- music
- song lyrics
size_categories:
- 10K<n<100K
---
# Dataset Card for "genius"
### Dataset Summary
List of Preprocessed Song Lyrics and Song Titles from [Genius](https://genius.com/)
Source data found [here](https://www.cs.cornell.edu/~arb/data/genius-expertise/). |
YBXL/PMC_CaseReport_Reasoning_test_sub | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 5308861
num_examples: 500
- name: valid
num_bytes: 5308861
num_examples: 500
- name: test
num_bytes: 5308861
num_examples: 500
download_size: 7255380
dataset_size: 15926583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
lamini/text_to_sql_finetune | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 34818227
num_examples: 16428
- name: test
num_bytes: 1050788
num_examples: 1034
download_size: 3691335
dataset_size: 35869015
---
# Dataset Card for "text_to_sql_finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CultriX__CombinaTrix-7B | ---
pretty_name: Evaluation run of CultriX/CombinaTrix-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CultriX/CombinaTrix-7B](https://huggingface.co/CultriX/CombinaTrix-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__CombinaTrix-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T04:05:35.594015](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__CombinaTrix-7B/blob/main/results_2024-01-26T04-05-35.594015.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545559165941275,\n\
\ \"acc_stderr\": 0.031943134755916015,\n \"acc_norm\": 0.6538633458121492,\n\
\ \"acc_norm_stderr\": 0.03261032541182886,\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.017297421448534748,\n \"mc2\": 0.706271087965262,\n\
\ \"mc2_stderr\": 0.014887346338811254\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545803\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7153953395737901,\n\
\ \"acc_stderr\": 0.004503037601847085,\n \"acc_norm\": 0.8839872535351524,\n\
\ \"acc_norm_stderr\": 0.0031958572477049163\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\
\ \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 0.41899441340782123,\n\
\ \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.017297421448534748,\n \"mc2\": 0.706271087965262,\n\
\ \"mc2_stderr\": 0.014887346338811254\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028214\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624179\n }\n}\n```"
repo_url: https://huggingface.co/CultriX/CombinaTrix-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|arc:challenge|25_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|gsm8k|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hellaswag|10_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T04-05-35.594015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T04-05-35.594015.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- '**/details_harness|winogrande|5_2024-01-26T04-05-35.594015.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T04-05-35.594015.parquet'
- config_name: results
data_files:
- split: 2024_01_26T04_05_35.594015
path:
- results_2024-01-26T04-05-35.594015.parquet
- split: latest
path:
- results_2024-01-26T04-05-35.594015.parquet
---
# Dataset Card for Evaluation run of CultriX/CombinaTrix-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CultriX/CombinaTrix-7B](https://huggingface.co/CultriX/CombinaTrix-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CultriX__CombinaTrix-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-26T04:05:35.594015](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__CombinaTrix-7B/blob/main/results_2024-01-26T04-05-35.594015.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "latest" split of the corresponding eval):
```json
{
"all": {
"acc": 0.6545559165941275,
"acc_stderr": 0.031943134755916015,
"acc_norm": 0.6538633458121492,
"acc_norm_stderr": 0.03261032541182886,
"mc1": 0.576499388004896,
"mc1_stderr": 0.017297421448534748,
"mc2": 0.706271087965262,
"mc2_stderr": 0.014887346338811254
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545803
},
"harness|hellaswag|10": {
"acc": 0.7153953395737901,
"acc_stderr": 0.004503037601847085,
"acc_norm": 0.8839872535351524,
"acc_norm_stderr": 0.0031958572477049163
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861677,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861677
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.017297421448534748,
"mc2": 0.706271087965262,
"mc2_stderr": 0.014887346338811254
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028214
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624179
}
}
```
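The top-level `"all"` block is (to a close approximation) the mean of the per-task metrics. A minimal sketch of that aggregation, using a small hypothetical subset of the scores reported above:

```python
# Illustrative only: average a few of the per-task accuracies listed above.
# The real leaderboard aggregation averages across all tasks.
scores = {
    "harness|hendrycksTest-abstract_algebra|5": 0.33,
    "harness|hendrycksTest-anatomy|5": 0.6666666666666666,
    "harness|hendrycksTest-astronomy|5": 0.7171052631578947,
}
mean_acc = sum(scores.values()) / len(scores)
print(round(mean_acc, 4))  # → 0.5713
```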
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Antreas/TALI | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: image_url
dtype: string
- name: item_idx
dtype: int64
- name: wit_features
struct:
- name: attribution_passes_lang_id
sequence: bool
- name: caption_alt_text_description
sequence: string
- name: caption_reference_description
sequence: string
- name: caption_title_and_reference_description
sequence: string
- name: context_page_description
sequence: string
- name: context_section_description
sequence: string
- name: hierarchical_section_title
sequence: string
- name: is_main_image
sequence: bool
- name: language
sequence: string
- name: page_changed_recently
sequence: bool
- name: page_title
sequence: string
- name: page_url
sequence: string
- name: section_title
sequence: string
- name: wit_idx
dtype: int64
- name: youtube_title_text
dtype: string
- name: youtube_description_text
dtype: string
- name: youtube_video_content
dtype: binary
- name: youtube_video_starting_time
dtype: string
- name: youtube_subtitle_text
dtype: string
- name: youtube_video_size
dtype: int64
- name: youtube_video_file_path
dtype: string
splits:
- name: train
num_bytes: 1902638101655.625
num_examples: 1052915
- name: val
num_bytes: 104485442867.25
num_examples: 57958
- name: test
num_bytes: 111107332347.375
num_examples: 61389
download_size: 2058391040534
dataset_size: 2118230876870.25
license: cc-by-4.0
task_categories:
- zero-shot-classification
tags:
- video
- audio
- text
- image
- tetramodal
- multimodal
- youtube
- wikipedia
pretty_name: TALI
size_categories:
- 1M<n<10M
---
# Dataset Card for "TALI"
## Table of Contents
1. Dataset Description
   1. Abstract
   2. Brief Description
2. Dataset Information
   1. Modalities
   2. Dataset Variants
   3. Dataset Statistics
   4. Data Fields
   5. Data Splits
3. Dataset Creation
4. Dataset Use
5. Additional Information
## Dataset Description
### Abstract
TALI is a large-scale, tetramodal dataset designed to facilitate a shift from unimodal and duomodal to tetramodal research in deep learning. It aligns text, video, images, and audio, providing a rich resource for innovative self-supervised learning tasks and multimodal research. TALI enables exploration of how different modalities and data/model scaling affect downstream performance, with the aim of inspiring diverse research ideas and enhancing understanding of model capabilities and robustness in deep learning.
### Brief Description
TALI (Temporally and semantically Aligned Audio, Language and Images) is a dataset that uses the Wikipedia Image Text (WIT) captions and article titles to search Youtube for videos that match the captions. It then downloads the video, audio, and subtitles from these videos. The result is a rich multimodal dataset that has multiple caption types related to both the WiT Images, and the Youtube videos. This enables learning to take place between either temporally or semantically aligned text, images, audio and video.
## Dataset Information
### Modalities
The TALI dataset consists of the following modalities:
1. Image:
1. Wikipedia caption image
2. Randomly sampled image from youtube video
2. Text
1. Wikipedia Caption Text
2. Wikipedia Title Text
3. Wikipedia Main Body Text
4. YouTube Subtitle Text
5. YouTube Description Text
6. YouTube Title Text
3. Audio
1. YouTube Content Audio
4. Video
1. YouTube Content Video
## Usage
To get started with TALI, you can load the dataset via Hugging Face's `datasets` library through our helper functions. We don't use `datasets` directly because we found `huggingface_hub` downloads to be much faster and more reliable. For a full set of possible configurations, look at [examples.py](examples.py). Here's a basic usage example:
First install the tali package:
### Installation
For the default install use:
```bash
pip install git+https://github.com/AntreasAntoniou/TALI
```
For the dev install use:
```bash
pip install git+https://github.com/AntreasAntoniou/TALI[dev]
```
Then use the dataset using:
### Examples
Import relevant helper functions
```python
import pathlib
from enum import Enum
import torch
from tqdm.auto import tqdm
from tali.data import (
SubModalityTypes,
TALIBaseTransform,
TALIBaseTransformConfig,
VideoFramesFormat,
default_transforms,
load_dataset_via_hub,
)
```
#### TALI with default transforms (CLIP and Whisper) and no streaming
```python
def tali_with_transforms_no_streaming(
dataset_storage_path: pathlib.Path | str,
):
if isinstance(dataset_storage_path, str):
dataset_storage_path = pathlib.Path(dataset_storage_path)
dataset = load_dataset_via_hub(
dataset_storage_path, dataset_name="Antreas/TALI"
)["train"]
(
image_transforms,
text_transforms,
audio_transforms,
video_transforms,
) = default_transforms()
preprocessing_transform = TALIBaseTransform(
cache_dir=dataset_storage_path / "cache",
text_tokenizer=text_transforms,
image_tokenizer=image_transforms,
audio_tokenizer=audio_transforms,
video_tokenizer=video_transforms,
config=TALIBaseTransformConfig(
root_filepath=dataset_storage_path,
modality_list=[
SubModalityTypes.youtube_content_video,
SubModalityTypes.youtube_content_audio,
SubModalityTypes.youtube_random_video_frame,
SubModalityTypes.youtube_subtitle_text,
SubModalityTypes.youtube_description_text,
SubModalityTypes.youtube_title_text,
SubModalityTypes.wikipedia_caption_image,
SubModalityTypes.wikipedia_caption_text,
SubModalityTypes.wikipedia_main_body_text,
SubModalityTypes.wikipedia_title_text,
],
video_frames_format=VideoFramesFormat.PIL,
),
)
for sample in tqdm(dataset):
sample = preprocessing_transform(sample)
print(list(sample.keys()))
for key, value in sample.items():
if hasattr(value, "shape"):
print(key, value.shape)
elif isinstance(value, torch.Tensor):
print(key, value.shape)
elif hasattr(value, "__len__"):
print(key, len(value))
print(key, type(value))
break
```
#### TALI with no transforms and no streaming, returning text as text, images as PIL images, videos as a list of PIL images, and audio as a sequence of floats
```python
def tali_without_transforms_no_streaming(
dataset_storage_path: pathlib.Path | str,
):
if isinstance(dataset_storage_path, str):
dataset_storage_path = pathlib.Path(dataset_storage_path)
dataset = load_dataset_via_hub(
dataset_storage_path, dataset_name="Antreas/TALI"
)["train"]
preprocessing_transform = TALIBaseTransform(
cache_dir=dataset_storage_path / "cache",
text_tokenizer=None,
image_tokenizer=None,
audio_tokenizer=None,
video_tokenizer=None,
config=TALIBaseTransformConfig(
root_filepath=dataset_storage_path,
modality_list=[
SubModalityTypes.youtube_content_video,
SubModalityTypes.youtube_content_audio,
SubModalityTypes.youtube_random_video_frame,
SubModalityTypes.youtube_subtitle_text,
SubModalityTypes.youtube_description_text,
SubModalityTypes.youtube_title_text,
SubModalityTypes.wikipedia_caption_image,
SubModalityTypes.wikipedia_caption_text,
SubModalityTypes.wikipedia_main_body_text,
SubModalityTypes.wikipedia_title_text,
],
video_frames_format=VideoFramesFormat.PIL,
),
)
for sample in tqdm(dataset):
sample = preprocessing_transform(sample)
print(list(sample.keys()))
for key, value in sample.items():
if hasattr(value, "shape"):
print(key, value.shape)
elif isinstance(value, torch.Tensor):
print(key, value.shape)
elif hasattr(value, "__len__"):
print(key, len(value))
print(key, type(value))
break
```
#### TALI with default transforms and streaming
```python
def tali_with_transforms_streaming(
dataset_storage_path: pathlib.Path | str,
):
if isinstance(dataset_storage_path, str):
dataset_storage_path = pathlib.Path(dataset_storage_path)
dataset = load_dataset_via_hub(
dataset_storage_path, dataset_name="Antreas/TALI", streaming=True
)["train"]
(
image_transforms,
text_transforms,
audio_transforms,
video_transforms,
) = default_transforms()
preprocessing_transform = TALIBaseTransform(
cache_dir=dataset_storage_path / "cache",
text_tokenizer=text_transforms,
image_tokenizer=image_transforms,
audio_tokenizer=audio_transforms,
video_tokenizer=video_transforms,
config=TALIBaseTransformConfig(
root_filepath=dataset_storage_path,
modality_list=[
SubModalityTypes.youtube_content_video,
SubModalityTypes.youtube_content_audio,
SubModalityTypes.youtube_random_video_frame,
SubModalityTypes.youtube_subtitle_text,
SubModalityTypes.youtube_description_text,
SubModalityTypes.youtube_title_text,
SubModalityTypes.wikipedia_caption_image,
SubModalityTypes.wikipedia_caption_text,
SubModalityTypes.wikipedia_main_body_text,
SubModalityTypes.wikipedia_title_text,
],
video_frames_format=VideoFramesFormat.PIL,
),
)
for sample in tqdm(dataset):
sample = preprocessing_transform(sample)
print(list(sample.keys()))
for key, value in sample.items():
if hasattr(value, "shape"):
print(key, value.shape)
elif isinstance(value, torch.Tensor):
print(key, value.shape)
elif hasattr(value, "__len__"):
print(key, len(value))
print(key, type(value))
break
```
#### TALI with no transforms and streaming, returning text as text, images as PIL images, videos as a list of PIL images, and audio as a sequence of floats
```python
def tali_without_transforms_streaming(
dataset_storage_path: pathlib.Path | str,
):
if isinstance(dataset_storage_path, str):
dataset_storage_path = pathlib.Path(dataset_storage_path)
dataset = load_dataset_via_hub(
dataset_storage_path, dataset_name="Antreas/TALI", streaming=True
)["train"]
preprocessing_transform = TALIBaseTransform(
cache_dir=dataset_storage_path / "cache",
text_tokenizer=None,
image_tokenizer=None,
audio_tokenizer=None,
video_tokenizer=None,
config=TALIBaseTransformConfig(
root_filepath=dataset_storage_path,
modality_list=[
SubModalityTypes.youtube_content_video,
SubModalityTypes.youtube_content_audio,
SubModalityTypes.youtube_random_video_frame,
SubModalityTypes.youtube_subtitle_text,
SubModalityTypes.youtube_description_text,
SubModalityTypes.youtube_title_text,
SubModalityTypes.wikipedia_caption_image,
SubModalityTypes.wikipedia_caption_text,
SubModalityTypes.wikipedia_main_body_text,
SubModalityTypes.wikipedia_title_text,
],
video_frames_format=VideoFramesFormat.PIL,
),
)
for sample in tqdm(dataset):
sample = preprocessing_transform(sample)
print(list(sample.keys()))
for key, value in sample.items():
if hasattr(value, "shape"):
print(key, value.shape)
elif isinstance(value, torch.Tensor):
print(key, value.shape)
elif hasattr(value, "__len__"):
print(key, len(value))
print(key, type(value))
break
```
### Dataset Statistics
TBA
## Dataset Creation
The TALI dataset was created by starting from the WiT dataset and using either the context_page_description or page_title as a source query to search YouTube for videos that were Creative Commons opted-in and not age-restricted. The top 100 result titles were returned and compared with the source query using the CLIP text embeddings of the largest CLIP model available. The top-1 title's video based on the CLIP ranking was chosen and downloaded. The video was broken into 30-second segments, and the top-10 segments for each video were chosen based on the distance between the CLIP image embedding of the first image of each segment and the CLIP text embedding of the video's title. The image, audio, and subtitle frames were extracted from these segments. At sampling time, one of these 10 segments is randomly selected, and a 10-second segment is chosen out of the 30-second clip. The result is 200 video frames (spread throughout the 10-second segment) and 160,000 audio frames (10 seconds).
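The title-ranking step described above can be sketched as follows. This is an illustrative reconstruction, not the original pipeline code; the random vectors stand in for real CLIP text embeddings of the source query and the 100 candidate titles:

```python
import numpy as np

# Stand-ins for CLIP text embeddings (assumed 512-dimensional here).
rng = np.random.default_rng(0)
query_emb = rng.normal(size=512)           # embedding of the source query
title_embs = rng.normal(size=(100, 512))   # embeddings of top-100 result titles

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Rank candidate titles by similarity to the query; the top-1 title's
# video is the one that would be downloaded.
sims = np.array([cosine(query_emb, t) for t in title_embs])
top1 = int(np.argmax(sims))
```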
## Dataset Use
TALI is designed for use in a wide range of multimodal research tasks, including but not limited to:
- Multimodal understanding and reasoning
- Self-supervised learning
- Multimodal alignment and translation
- Multimodal summarization
- Multimodal question answering
## Dataset Curators: Antreas Antoniou
Citation Information: TBA
Contributions: Thanks to all contributors including data curators, annotators, and software developers.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Edge-Pyxos/CRaQAn_v1 | ---
language:
- en
license: cc-by-4.0
size_categories:
- n<1K
task_categories:
- question-answering
pretty_name: craqan_v1
tags:
- legal
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: article_titles
sequence: string
- name: article_sections
sequence: string
- name: section
dtype: string
- name: section_index
dtype: int64
- name: section_sentences
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: sentences_required
sequence: int64
- name: url
dtype: string
- name: time_downloaded
dtype: string
splits:
- name: train
num_bytes: 17788270
num_examples: 263
download_size: 0
dataset_size: 17788270
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Coreference Resolution in Question Answering (CRaQAn)
250+ question-answer pairs that require coreference resolution across sentences from selected Wikipedia passages.
## Generation Process
Given the relative complexity of our task (coreference resolution across passages for question-answering), we aimed
to avoid crowd-sourcing this dataset and instead focused on using LLMs to automate our process. Every question-answer
pair in the CRaQAn dataset was automatically generated using a Recursive Criticism and Improvement (RCI) loop. To
accomplish our RCI loop, we wrote a GENERATOR prompt and several REVIEWER prompts, which can be found [here](https://huggingface.co/datasets/Edge-Pyxos/CRaQAn_v1/tree/main/generation_demo/prompts).
## Review Process
Every question-answer pair in the CRaQAn v1 dataset was reviewed by at least two human reviewers. We intend for this to be a
high-trust and high-quality dataset that can be used for various applications. Every human reviewer was given the
following criteria. For each QA pair:
1. The question is clear and not ambiguous with regards to the text.
2. The question is a single question, and not two separate or related questions joined by the word "and".
3. The question does not contain or assume any information outside of the required sentences.
4. The answer is correct and reasonably terse.
5. The question-answer pair must not rely on any information from outside the required sentences.
6. The question-answer pair relies on information from each of the required sentences.
7. The number of required sentences is 2 or 3.
8. The Markdown is correctly formatted.
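A few of the mechanical criteria above (a single question, a non-empty answer, 2 or 3 required sentences) can be checked automatically before human review. The helper below is an illustrative sketch of such a pre-filter, not the project's actual review tooling.

```python
def basic_checks(question: str, answer: str, sentences_required: list) -> list:
    """Return a list of violations for the mechanically checkable criteria."""
    violations = []
    # Criterion 2: a single question, not several joined together.
    if question.count("?") > 1:
        violations.append("multiple questions")
    # Criterion 4: the answer should be present.
    if not answer.strip():
        violations.append("empty answer")
    # Criterion 7: the number of required sentences is 2 or 3.
    if len(sentences_required) not in (2, 3):
        violations.append("required sentences not 2 or 3")
    return violations

basic_checks("Who founded the company?", "Alice.", [0, 2])  # -> []
```

Criteria such as answer correctness or reliance on outside information still require the two human reviewers described above.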
## CRaQAn Usage
```python
from datasets import load_dataset
import json
import pandas as pd
from IPython.display import display, Markdown
# Load dataset.
craqan = load_dataset("Edge-Pyxos/CRaQAn_v1", split = "train")
df = pd.DataFrame(craqan)
# Fix issue with section_sentences that happens during Huggingface conversion.
df["section_sentences"] = df["section_sentences"].apply(json.loads)
# Visualize a sample from the dataset.
row = df.sample(1).squeeze()
sentences = ""
for idx, s in enumerate(row.section_sentences):
sentences += (" <mark> " + s["sentence"] + " </mark> ") if idx in row.sentences_required else " " + s["sentence"]
display(Markdown(f"# Article: {row.title}"))
display(Markdown(row.article_titles[row.section_index]))
display(Markdown(f"*Required Sentences: {row.sentences_required}*"))
display(Markdown(sentences))
display(Markdown(f"**Question**: " + row.question))
display(Markdown("**Answer**: " + row.answer))
display(Markdown("-------------------"))
```
## Demo Usage
We provide all prompts, code, and processes used to generate the CRaQAn-v1 dataset in our [demo notebook](https://huggingface.co/datasets/Edge-Pyxos/CRaQAn_v1/blob/main/generation_demo/create_dataset.ipynb).
|
result-kand2-sdxl-wuerst-karlo/80bca589 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 242
num_examples: 10
download_size: 1409
dataset_size: 242
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "80bca589"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jeovane/minhavozmr | ---
license: openrail
---
|
open-llm-leaderboard/details_huggingtweets__jerma985 | ---
pretty_name: Evaluation run of huggingtweets/jerma985
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggingtweets/jerma985](https://huggingface.co/huggingtweets/jerma985) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingtweets__jerma985\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T15:13:39.388412](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__jerma985/blob/main/results_2023-09-22T15-13-39.388412.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014786073825503355,\n\
\ \"em_stderr\": 0.0012360366760473087,\n \"f1\": 0.0371633808724832,\n\
\ \"f1_stderr\": 0.001611424008567761,\n \"acc\": 0.2533543804262036,\n\
\ \"acc_stderr\": 0.0070256103461651745\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760473087,\n\
\ \"f1\": 0.0371633808724832,\n \"f1_stderr\": 0.001611424008567761\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n\
\ \"acc_stderr\": 0.014051220692330349\n }\n}\n```"
repo_url: https://huggingface.co/huggingtweets/jerma985
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_13_39.388412
path:
- '**/details_harness|drop|3_2023-09-22T15-13-39.388412.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-13-39.388412.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_13_39.388412
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-13-39.388412.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-13-39.388412.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:38:23.212427.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T10:38:23.212427.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_13_39.388412
path:
- '**/details_harness|winogrande|5_2023-09-22T15-13-39.388412.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-13-39.388412.parquet'
- config_name: results
data_files:
- split: 2023_07_19T10_38_23.212427
path:
- results_2023-07-19T10:38:23.212427.parquet
- split: 2023_09_22T15_13_39.388412
path:
- results_2023-09-22T15-13-39.388412.parquet
- split: latest
path:
- results_2023-09-22T15-13-39.388412.parquet
---
# Dataset Card for Evaluation run of huggingtweets/jerma985
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggingtweets/jerma985
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggingtweets/jerma985](https://huggingface.co/huggingtweets/jerma985) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggingtweets__jerma985",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T15:13:39.388412](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__jerma985/blob/main/results_2023-09-22T15-13-39.388412.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.014786073825503355,
"em_stderr": 0.0012360366760473087,
"f1": 0.0371633808724832,
"f1_stderr": 0.001611424008567761,
"acc": 0.2533543804262036,
"acc_stderr": 0.0070256103461651745
},
"harness|drop|3": {
"em": 0.014786073825503355,
"em_stderr": 0.0012360366760473087,
"f1": 0.0371633808724832,
"f1_stderr": 0.001611424008567761
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330349
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Vinnyyw/Dulcesolos | ---
license: openrail
---
|
AppleHarem/akane_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akane (Blue Archive)
This is the dataset of akane (Blue Archive), containing 498 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and related tools is available here: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 498 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 1352 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 1567 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 498 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 498 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 498 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 1352 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 1352 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 1256 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 1567 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 1567 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Abhi5ingh/vitonclip | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1313726702.738
num_examples: 11647
download_size: 1255546203
dataset_size: 1313726702.738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vitonclip"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/gpt4all-clean | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 608770781
num_examples: 374269
download_size: 0
dataset_size: 608770781
license: mit
task_categories:
- conversational
language:
- en
---
# Dataset Card for "GPT4All-Clean"
The GPT4All-Clean dataset is a modified version of the original GPT4All dataset. It contains 374,269 examples, most of which were converted to markdown format to improve consistency and compatibility with other datasets that use markdown formatting. It is smaller than the original dataset (437,604 examples) because certain content was removed: all examples containing the phrase "As an AI language model" were dropped, as were examples containing the string "html", to minimize confusion between real and non-real HTML code for the parser used to clean the examples. These modifications are intended to improve the dataset's overall quality, making it more suitable for research and applications. |
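The filtering described above can be sketched in a few lines. This is a hypothetical illustration, not the actual cleaning script: `clean_examples` and its banned-substring list are assumptions based on the description.

```python
def clean_examples(examples):
    """Drop examples matching the removal criteria described above (sketch).

    Any example whose prompt or response contains the refusal phrase
    "As an AI language model" or the string "html" is filtered out.
    """
    banned = ("As an AI language model", "html")
    return [
        ex for ex in examples
        if not any(b in ex["prompt"] or b in ex["response"] for b in banned)
    ]

examples = [
    {"prompt": "What is 2+2?", "response": "4"},
    {"prompt": "Who are you?", "response": "As an AI language model, I cannot say."},
    {"prompt": "Make a page", "response": "<html><body>hi</body></html>"},
]
print(clean_examples(examples))  # keeps only the first example
```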
Hack90/virus_dna_dedup_minihash_0.9_kmer_7 | ---
dataset_info:
features:
- name: sequence_x
dtype: string
- name: similarity_filter
dtype: float64
- name: id
dtype: string
- name: sequence_y
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
- name: missing_seq_count
dtype: int64
- name: missingness
dtype: float64
- name: seq_filled
dtype: string
- name: __index_level_0__
dtype: int64
- name: spaced_sequence
dtype: string
splits:
- name: train
num_bytes: 522191271
num_examples: 10885
download_size: 234031394
dataset_size: 522191271
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "virus_dna_dedup_minihash_0.9_kmer_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhuchi76/wine_review | ---
dataset_info:
features:
- name: wine_id
dtype: int64
- name: country
dtype: string
- name: description
dtype: string
- name: designation
dtype: string
- name: points
dtype: int64
- name: price
dtype: float64
splits:
- name: train
num_bytes: 21093175.17523332
num_examples: 68918
- name: test
num_bytes: 5273446.824766681
num_examples: 17230
download_size: 15005883
dataset_size: 26366622.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
fivewords/test | ---
license: apache-2.0
language:
- zh
---
hello world |
malucoelhaofc/LauroV2 | ---
license: openrail
---
|
allenai/fos_model_training_data_open_ai_annotations | ---
extra_gated_prompt: "**AI2 ImpACT License – Low Risk Artifacts (LR Agreement)** [https://allenai.org/impact-license](https://allenai.org/impact-license)"
extra_gated_fields:
Name: text
Organization/Entity: text
Email: text
State/Country: text
"Intended Use": text
"I AGREE to the terms and conditions of the LR Agreement above": checkbox
"I AGREE to AI2’s use of my information for legal notices and administrative matters": checkbox
"I CERTIFY that the information I have provided is true and accurate": checkbox
---
|
FanChen0116/bus_few4_05x_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 6172
num_examples: 35
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 83690
---
# Dataset Card for "bus_few4_05x_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Asap7772/skewexp_maxlength | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: output
dtype: string
- name: text
dtype: string
- name: alpaca_text
dtype: string
- name: prompt
dtype: string
- name: alpaca_prompt
dtype: string
- name: y_ref
dtype: string
- name: y_1
dtype: string
- name: y_2
dtype: string
- name: y_w
dtype: string
- name: y_w_alpaca
dtype: string
- name: y_l
dtype: string
- name: y_l_alpaca
dtype: string
- name: y_w_score
dtype: float64
- name: y_l_score
dtype: float64
- name: score_diff
dtype: float64
splits:
- name: train
num_bytes: 62156813
num_examples: 19000
- name: test
num_bytes: 3233542
num_examples: 1000
download_size: 31145494
dataset_size: 65390355
---
# Dataset Card for "skewexp_maxlength"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/story_8_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3687
num_examples: 10
download_size: 5182
dataset_size: 3687
---
# Dataset Card for "story_8_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf | ---
pretty_name: Evaluation run of codellama/CodeLlama-13b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T02:27:47.858383](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf/blob/main/results_2023-10-16T02-27-47.858383.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413506,\n \"f1\": 0.05136010906040279,\n\
\ \"f1_stderr\": 0.001238131643997091,\n \"acc\": 0.4034791730120101,\n\
\ \"acc_stderr\": 0.011133121900373116\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413506,\n\
\ \"f1\": 0.05136010906040279,\n \"f1_stderr\": 0.001238131643997091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12661106899166036,\n \
\ \"acc_stderr\": 0.009159715283081094\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6803472770323599,\n \"acc_stderr\": 0.013106528517665136\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|arc:challenge|25_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T02_27_47.858383
path:
- '**/details_harness|drop|3_2023-10-16T02-27-47.858383.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T02-27-47.858383.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T02_27_47.858383
path:
- '**/details_harness|gsm8k|5_2023-10-16T02-27-47.858383.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T02-27-47.858383.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hellaswag|10_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T02_27_47.858383
path:
- '**/details_harness|winogrande|5_2023-10-16T02-27-47.858383.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T02-27-47.858383.parquet'
- config_name: results
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- results_2023-08-25T17:15:30.693025.parquet
- split: 2023_10_16T02_27_47.858383
path:
- results_2023-10-16T02-27-47.858383.parquet
- split: latest
path:
- results_2023-10-16T02-27-47.858383.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-13b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf",
"harness_winogrande_5",
	split="latest")
```
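Because the timestamp-named splits use a zero-padded, ISO-like format, resolving which run is newest only requires parsing the names and taking the maximum. A minimal sketch (the `latest_split` helper is hypothetical; in practice the `latest` split defined in the config already resolves this for you):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the newest timestamp-named split from a list of split names.

    Split names look like '2023_10_16T02_27_47.858383'; the special
    'latest' alias is skipped.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=parse)

# Example with the two runs present in this repo:
print(latest_split([
    "2023_08_25T17_15_30.693025",
    "2023_10_16T02_27_47.858383",
    "latest",
]))
```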
## Latest results
These are the [latest results from run 2023-10-16T02:27:47.858383](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf/blob/main/results_2023-10-16T02-27-47.858383.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413506,
"f1": 0.05136010906040279,
"f1_stderr": 0.001238131643997091,
"acc": 0.4034791730120101,
"acc_stderr": 0.011133121900373116
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413506,
"f1": 0.05136010906040279,
"f1_stderr": 0.001238131643997091
},
"harness|gsm8k|5": {
"acc": 0.12661106899166036,
"acc_stderr": 0.009159715283081094
},
"harness|winogrande|5": {
"acc": 0.6803472770323599,
"acc_stderr": 0.013106528517665136
}
}
```
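The payload above is a plain nested dict, so extracting per-task metrics is ordinary dict handling; no leaderboard API is involved. A minimal sketch over the shape shown above (values copied from the results block; here the top-level "all" accuracy is simply the mean of the per-task accuracies):

```python
# Results payload shaped like the JSON above (abbreviated to the acc fields).
results = {
    "all": {"acc": 0.4034791730120101},
    "harness|gsm8k|5": {"acc": 0.12661106899166036},
    "harness|winogrande|5": {"acc": 0.6803472770323599},
}

# Collect accuracy for each per-task entry, skipping the aggregate "all" key.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# Sanity check: the aggregate accuracy is the unweighted mean of the tasks.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(per_task_acc)
print(mean_acc)
```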
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

open-llm-leaderboard/details_HanNayeoniee__LHK

---
pretty_name: Evaluation run of HanNayeoniee/LHK
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HanNayeoniee/LHK](https://huggingface.co/HanNayeoniee/LHK) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HanNayeoniee__LHK\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T08:36:46.255504](https://huggingface.co/datasets/open-llm-leaderboard/details_HanNayeoniee__LHK/blob/main/results_2024-01-19T08-36-46.255504.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522091397950804,\n\
\ \"acc_stderr\": 0.031753247805341805,\n \"acc_norm\": 0.6548324288435591,\n\
\ \"acc_norm_stderr\": 0.032386658093364704,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476848,\n \"mc2\": 0.591200906179408,\n\
\ \"mc2_stderr\": 0.01538726882622229\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n\
\ \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205766\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657239593706433,\n\
\ \"acc_stderr\": 0.004736621698861176,\n \"acc_norm\": 0.844851623182633,\n\
\ \"acc_norm_stderr\": 0.0036130615166899823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503575,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503575\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634342,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634342\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586234,\n \
\ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586234\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786747,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786747\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.01518384430720616,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.01518384430720616\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046102,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\
\ \"acc_stderr\": 0.01275841094103892,\n \"acc_norm\": 0.4784876140808344,\n\
\ \"acc_norm_stderr\": 0.01275841094103892\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.017270015284476848,\n \"mc2\": 0.591200906179408,\n\
\ \"mc2_stderr\": 0.01538726882622229\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5633055344958302,\n \
\ \"acc_stderr\": 0.01366164978090549\n }\n}\n```"
repo_url: https://huggingface.co/HanNayeoniee/LHK
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|arc:challenge|25_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|gsm8k|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hellaswag|10_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T08-36-46.255504.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- '**/details_harness|winogrande|5_2024-01-19T08-36-46.255504.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T08-36-46.255504.parquet'
- config_name: results
data_files:
- split: 2024_01_19T08_36_46.255504
path:
- results_2024-01-19T08-36-46.255504.parquet
- split: latest
path:
- results_2024-01-19T08-36-46.255504.parquet
---
# Dataset Card for Evaluation run of HanNayeoniee/LHK
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HanNayeoniee/LHK](https://huggingface.co/HanNayeoniee/LHK) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HanNayeoniee__LHK",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-19T08:36:46.255504](https://huggingface.co/datasets/open-llm-leaderboard/details_HanNayeoniee__LHK/blob/main/results_2024-01-19T08-36-46.255504.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6522091397950804,
"acc_stderr": 0.031753247805341805,
"acc_norm": 0.6548324288435591,
"acc_norm_stderr": 0.032386658093364704,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476848,
"mc2": 0.591200906179408,
"mc2_stderr": 0.01538726882622229
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205766
},
"harness|hellaswag|10": {
"acc": 0.657239593706433,
"acc_stderr": 0.004736621698861176,
"acc_norm": 0.844851623182633,
"acc_norm_stderr": 0.0036130615166899823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337124,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337124
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503575,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503575
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634342,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634342
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.024381406832586234,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.024381406832586234
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786747,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786747
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.01518384430720616,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.01518384430720616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046102,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.01275841094103892,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.01275841094103892
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476848,
"mc2": 0.591200906179408,
"mc2_stderr": 0.01538726882622229
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.01103033579861744
},
"harness|gsm8k|5": {
"acc": 0.5633055344958302,
"acc_stderr": 0.01366164978090549
}
}
```
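The aggregate `acc` under `"all"` is an average over the individual tasks. As a rough sketch of that relation (using only four of the per-task values above, so the result only approximates the full aggregate):

```python
# Hedged sketch: recompute an average over a handful of the per-task
# accuracies listed above. The leaderboard aggregate averages all tasks,
# so this four-task sample only illustrates the calculation, not the
# reported 0.6522 value.
accs = {
    "abstract_algebra": 0.39,
    "anatomy": 0.5925925925925926,
    "astronomy": 0.7368421052631579,
    "business_ethics": 0.71,
}
mean_acc = sum(accs.values()) / len(accs)
print(round(mean_acc, 4))
```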
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dodosh/CodeSearchNet | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: code
dtype: string
- name: docstring
dtype: string
splits:
- name: train
num_bytes: 417539337
num_examples: 457461
download_size: 193602075
dataset_size: 417539337
---
# Dataset Card for "CodeSearchNet"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
grosenthal/lat_en_loeb_whitaker_split | ---
dataset_info:
features:
- name: id
dtype: int64
- name: la
dtype: string
- name: en
dtype: string
- name: file
dtype: string
splits:
- name: train
num_bytes: 30517119.261391733
num_examples: 77774
download_size: 18966593
dataset_size: 30517119.261391733
---
# Dataset Card for "lat_en_loeb_whitaker_split"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lawinstruct/lawinstruct | ---
annotations_creators:
- other
language_creators:
- found
language:
- bg
- cs
- da
- de
- el
- en
- es
- et
- fi
- fr
- ga
- hr
- hu
- it
- lt
- lv
- mt
- nl
- pl
- pt
- ro
- sk
- sl
- sv
- zh
- ja
- ko
license:
- mit
multilinguality:
- multilingual
paperswithcode_id: null
pretty_name: "LawInstruct: A Diverse Multilingual Dataset for Legal Instruction Tuning"
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- fill-mask
---
# Dataset Card for LawInstruct: A Diverse Multilingual Dataset for Legal Instruction Tuning
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [GitHub](https://github.com/JoelNiklaus/LawInstruct)
- **Paper:** [ArXiv](https://arxiv.org/abs/2404.02127)
- **Leaderboard:**
- **Point of Contact:** [Joel Niklaus](mailto:joel@niklaus.ai)
### Dataset Summary
LawInstruct is a diverse multilingual dataset for legal instruction tuning.
### Supported Tasks and Leaderboards
The dataset supports the task of text generation.
### Languages
The following languages are supported:
bg, cs, da, de, el, en, es, et, fi, fr, ga, hr, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv, zh, ja, ko
## Dataset Structure
The data is sharded into files named `{name}_train.{shard}.jsonl.xz`.
LawInstruct has the following data fields:
- `dataset_name`: The name of the dataset
- `subset_name`: The name of the sub-dataset, if applicable
- `source`: The URL of the source
- `prompt_language`: The language of the prompt
- `answer_language`: The language of the answer
- `jurisdiction`: The jurisdiction of the dataset
- `task_type`: The task type of the dataset
- `downloaded_timestamp`: The timestamp when the source data was downloaded for LawInstruct
- `text`: The text, consisting of the prompt and the answer
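Since each shard is plain xz-compressed JSON Lines, it can also be read without the `datasets` library. A minimal sketch (the record here is a fabricated illustration of the fields above, not real data):

```python
import json
import lzma

# Fabricated sample record mirroring the field layout described above;
# real shards follow the {name}_train.{shard}.jsonl.xz naming pattern.
sample = {
    "dataset_name": "LegalQA",
    "subset_name": "legal_qa",
    "source": "https://example.com/legal_qa",
    "prompt_language": "en",
    "answer_language": "en",
    "jurisdiction": "N/A",
    "task_type": "QUESTION_ANSWERING",
    "downloaded_timestamp": "2023-01-01",
    "text": "Question: ...\nAnswer: ...",
}

path = "sample_train.0.jsonl.xz"
# Write one JSON object per line, xz-compressed, like a real shard.
with lzma.open(path, "wt", encoding="utf-8") as f:
    f.write(json.dumps(sample) + "\n")

# Each line of a shard decodes to one record with the fields listed above.
with lzma.open(path, "rt", encoding="utf-8") as f:
    record = json.loads(next(f))

print(record["dataset_name"], record["task_type"])
```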
### Data Instances
The file format is jsonl.xz, and a `train` split is available.
### Data Fields
[More Information Needed]
### Data Splits
There is one split: train.
#### Data Size
```bash
$ xz --list data/*.xz
Strms Blocks Compressed Uncompressed Ratio Check Filename
1 1 515.4 KiB 3445.8 KiB 0.150 CRC64 data/BrazilianBarExam-brazilian_bar_exam-train-0.jsonl.xz
1 1 379.6 MiB 6327.9 MiB 0.060 CRC64 data/BrCAD5-brcad5_judgment-train-0.jsonl.xz
1 1 379.7 MiB 6336.8 MiB 0.060 CRC64 data/BrCAD5-brcad5_law_area-train-0.jsonl.xz
1 1 461.7 MiB 12.6 GiB 0.036 CRC64 data/BrCAD5-brcad5_mc-train-0.jsonl.xz
1 1 513.0 MiB 18.6 GiB 0.027 CRC64 data/BrCAD5-brcad5_topic-train-0.jsonl.xz
1 1 334.0 KiB 8444.7 KiB 0.040 CRC64 data/BVADecisions-bva_decisions_label-train-0.jsonl.xz
1 1 1416 B 5329 B 0.266 CRC64 data/BVADecisions-bva_decisions_qa-train-0.jsonl.xz
1 1 1535.5 KiB 7091.5 KiB 0.217 CRC64 data/CABarExamEssays-MainSubset-train-0.jsonl.xz
1 1 4541.7 KiB 140.3 MiB 0.032 CRC64 data/CAIL2019-cail_2019-train-0.jsonl.xz
1 1 609.7 KiB 14.4 MiB 0.041 CRC64 data/CAIL2022-cail_2022_crime-train-0.jsonl.xz
1 1 797.2 KiB 15.3 MiB 0.051 CRC64 data/CAIL2022-cail_2022_mc-train-0.jsonl.xz
1 1 518.4 KiB 8591.2 KiB 0.060 CRC64 data/CAIL2022-cail_2022_response-train-0.jsonl.xz
1 1 1344.6 KiB 7666.5 KiB 0.175 CRC64 data/CaseBriefs-case_briefs-train-0.jsonl.xz
1 1 2724.2 KiB 13.8 MiB 0.193 CRC64 data/ChangeMyView-change_my_view-train-0.jsonl.xz
1 1 1368.5 KiB 90.8 MiB 0.015 CRC64 data/ContractNLI-contract_nli-train-0.jsonl.xz
1 1 497.4 KiB 10.9 MiB 0.044 CRC64 data/EdgarNER-MainSubset-train-0.jsonl.xz
1 1 3001.3 KiB 406.2 MiB 0.007 CRC64 data/Ell4GreekNER-MainSubset-train-0.jsonl.xz
1 1 3410.7 KiB 705.6 MiB 0.005 CRC64 data/Ell18GreekNER-MainSubset-train-0.jsonl.xz
1 1 1592.2 KiB 17.2 MiB 0.090 CRC64 data/EOIRPrivacy-eoir_privacy-train-0.jsonl.xz
1 1 19.1 MiB 400.8 MiB 0.048 CRC64 data/EurLexSum-bulgarian-train-0.jsonl.xz
1 1 12.3 MiB 80.6 MiB 0.153 CRC64 data/EurLexSum-croatian-train-0.jsonl.xz
1 1 15.1 MiB 128.5 MiB 0.117 CRC64 data/EurLexSum-czech-train-0.jsonl.xz
1 1 13.0 MiB 94.1 MiB 0.138 CRC64 data/EurLexSum-danish-train-0.jsonl.xz
1 1 13.5 MiB 94.6 MiB 0.142 CRC64 data/EurLexSum-dutch-train-0.jsonl.xz
1 1 13.2 MiB 91.4 MiB 0.144 CRC64 data/EurLexSum-english-train-0.jsonl.xz
1 1 13.7 MiB 95.3 MiB 0.144 CRC64 data/EurLexSum-estonian-train-0.jsonl.xz
1 1 13.8 MiB 103.3 MiB 0.133 CRC64 data/EurLexSum-finnish-train-0.jsonl.xz
1 1 14.7 MiB 117.0 MiB 0.126 CRC64 data/EurLexSum-french-train-0.jsonl.xz
1 1 14.6 MiB 104.8 MiB 0.139 CRC64 data/EurLexSum-german-train-0.jsonl.xz
1 1 20.6 MiB 432.8 MiB 0.048 CRC64 data/EurLexSum-greek-train-0.jsonl.xz
1 1 15.3 MiB 135.7 MiB 0.113 CRC64 data/EurLexSum-hungarian-train-0.jsonl.xz
1 1 608.6 KiB 4496.0 KiB 0.135 CRC64 data/EurLexSum-irish-train-0.jsonl.xz
1 1 13.5 MiB 96.5 MiB 0.140 CRC64 data/EurLexSum-italian-train-0.jsonl.xz
1 1 14.4 MiB 116.5 MiB 0.123 CRC64 data/EurLexSum-latvian-train-0.jsonl.xz
1 1 14.4 MiB 107.1 MiB 0.135 CRC64 data/EurLexSum-lithuanian-train-0.jsonl.xz
1 1 14.8 MiB 109.0 MiB 0.136 CRC64 data/EurLexSum-maltese-train-0.jsonl.xz
1 1 14.5 MiB 108.6 MiB 0.134 CRC64 data/EurLexSum-polish-train-0.jsonl.xz
1 1 13.3 MiB 102.0 MiB 0.131 CRC64 data/EurLexSum-portuguese-train-0.jsonl.xz
1 1 14.3 MiB 112.4 MiB 0.127 CRC64 data/EurLexSum-romanian-train-0.jsonl.xz
1 1 14.5 MiB 114.4 MiB 0.127 CRC64 data/EurLexSum-slovak-train-0.jsonl.xz
1 1 13.1 MiB 86.5 MiB 0.152 CRC64 data/EurLexSum-slovenian-train-0.jsonl.xz
1 1 14.1 MiB 106.3 MiB 0.133 CRC64 data/EurLexSum-spanish-train-0.jsonl.xz
1 1 12.8 MiB 95.9 MiB 0.134 CRC64 data/EurLexSum-swedish-train-0.jsonl.xz
1 1 3062.1 KiB 47.7 MiB 0.063 CRC64 data/GermanLER-coarse-train-0.jsonl.xz
1 1 3109.2 KiB 53.9 MiB 0.056 CRC64 data/GermanLER-fine-train-0.jsonl.xz
1 1 66.8 KiB 2320.1 KiB 0.029 CRC64 data/GermanRentalAgreements-german_rental_agreements-train-0.jsonl.xz
1 1 126.6 MiB 718.1 MiB 0.176 CRC64 data/ILDC-ildc-train-0.jsonl.xz
1 1 630.1 KiB 8302.7 KiB 0.076 CRC64 data/IndianNER-MainSubset-train-0.jsonl.xz
1 1 1338.5 KiB 31.7 MiB 0.041 CRC64 data/IndianTextSegmentation-indian_text_segmentation-train-0.jsonl.xz
1 1 222.8 KiB 6645.0 KiB 0.034 CRC64 data/InternationalCitizenshipLawQuestions-international_citizenship_law_questions_mode_acq-train-0.jsonl.xz
1 1 97.9 KiB 2976.9 KiB 0.033 CRC64 data/InternationalCitizenshipLawQuestions-international_citizenship_law_questions_mode_loss-train-0.jsonl.xz
1 1 1073.5 KiB 11.7 MiB 0.090 CRC64 data/KoreanLegalQA-korean_legal_qa-train-0.jsonl.xz
1 1 891.6 MiB 15.8 GiB 0.055 CRC64 data/LawngNli-lawng_nli_entailment-train-0.jsonl.xz
1 1 3677.7 KiB 48.7 MiB 0.074 CRC64 data/LboxOpen-lbox_open_judgment-train-0.jsonl.xz
1 1 2789.0 KiB 39.1 MiB 0.070 CRC64 data/LboxOpen-lbox_open_statute-train-0.jsonl.xz
1 1 37.0 MiB 207.2 MiB 0.178 CRC64 data/LegalCaseDocumentSummarization-legal_case_summarization_india-train-0.jsonl.xz
1 1 10.6 MiB 61.0 MiB 0.174 CRC64 data/LegalCaseDocumentSummarization-legal_case_summarization_uk-train-0.jsonl.xz
1 1 2445.9 KiB 34.0 MiB 0.070 CRC64 data/LegalQA-legal_qa-train-0.jsonl.xz
1 1 15.0 MiB 104.7 MiB 0.143 CRC64 data/LexGLUE-case_hold-train-0.jsonl.xz
1 1 15.9 MiB 94.9 MiB 0.168 CRC64 data/LexGLUE-ecthr_a-train-0.jsonl.xz
1 1 15.9 MiB 96.0 MiB 0.166 CRC64 data/LexGLUE-ecthr_b-train-0.jsonl.xz
1 1 57.5 MiB 412.8 MiB 0.139 CRC64 data/LexGLUE-eurlex-train-0.jsonl.xz
1 1 6186.6 KiB 77.5 MiB 0.078 CRC64 data/LexGLUE-ledgar-train-0.jsonl.xz
1 1 36.5 MiB 177.4 MiB 0.205 CRC64 data/LexGLUE-scotus-train-0.jsonl.xz
1 1 282.3 KiB 6855.6 KiB 0.041 CRC64 data/LexGLUE-unfair_tos-train-0.jsonl.xz
1 1 473.9 KiB 5375.4 KiB 0.088 CRC64 data/LEXTREME-brazilian_court_decisions_judgment-train-0.jsonl.xz
1 1 276.9 KiB 2839.1 KiB 0.098 CRC64 data/LEXTREME-brazilian_court_decisions_unanimity-train-0.jsonl.xz
1 1 236.5 KiB 4782.8 KiB 0.049 CRC64 data/LEXTREME-covid19_emergency_event-train-0.jsonl.xz
1 1 1056.2 KiB 19.1 MiB 0.054 CRC64 data/LEXTREME-german_argument_mining-train-0.jsonl.xz
1 1 34.0 MiB 592.1 MiB 0.057 CRC64 data/LEXTREME-greek_legal_code_chapter-train-0.jsonl.xz
1 1 34.1 MiB 593.3 MiB 0.058 CRC64 data/LEXTREME-greek_legal_code_subject-train-0.jsonl.xz
1 1 33.9 MiB 592.0 MiB 0.057 CRC64 data/LEXTREME-greek_legal_code_volume-train-0.jsonl.xz
1 1 863.6 KiB 27.7 MiB 0.030 CRC64 data/LEXTREME-greek_legal_ner-train-0.jsonl.xz
1 1 296.2 KiB 7285.0 KiB 0.041 CRC64 data/LEXTREME-legalnero-train-0.jsonl.xz
1 1 405.9 KiB 8646.6 KiB 0.047 CRC64 data/LEXTREME-lener_br-train-0.jsonl.xz
1 1 1839.2 KiB 31.6 MiB 0.057 CRC64 data/LEXTREME-mapa_coarse-train-0.jsonl.xz
1 1 1869.8 KiB 37.0 MiB 0.049 CRC64 data/LEXTREME-mapa_fine-train-0.jsonl.xz
1 1 959.5 MiB 7131.3 MiB 0.135 CRC64 data/LEXTREME-multi_eurlex_level_1-train-0.jsonl.xz
1 1 392.9 MiB 2914.0 MiB 0.135 CRC64 data/LEXTREME-multi_eurlex_level_1-train-1.jsonl.xz
1 1 959.6 MiB 7137.3 MiB 0.134 CRC64 data/LEXTREME-multi_eurlex_level_2-train-0.jsonl.xz
1 1 393.2 MiB 2918.7 MiB 0.135 CRC64 data/LEXTREME-multi_eurlex_level_2-train-1.jsonl.xz
1 1 959.6 MiB 7142.6 MiB 0.134 CRC64 data/LEXTREME-multi_eurlex_level_3-train-0.jsonl.xz
1 1 393.6 MiB 2923.5 MiB 0.135 CRC64 data/LEXTREME-multi_eurlex_level_3-train-1.jsonl.xz
1 1 997.2 KiB 24.6 MiB 0.040 CRC64 data/LEXTREME-online_terms_of_service_clause_topics-train-0.jsonl.xz
1 1 163.1 KiB 2028.8 KiB 0.080 CRC64 data/LEXTREME-online_terms_of_service_unfairness_levels-train-0.jsonl.xz
1 1 28.9 MiB 257.6 MiB 0.112 CRC64 data/LEXTREME-swiss_judgment_prediction-train-0.jsonl.xz
1 1 8588 B 87.6 KiB 0.096 CRC64 data/Littleton-littleton_events-train-0.jsonl.xz
1 1 11.1 KiB 134.1 KiB 0.083 CRC64 data/Littleton-littleton_graph-train-0.jsonl.xz
1 1 544.2 KiB 34.7 MiB 0.015 CRC64 data/MAUD-answer-train-0.jsonl.xz
1 1 864.0 KiB 85.8 MiB 0.010 CRC64 data/MAUD-category-train-0.jsonl.xz
1 1 891.3 KiB 86.2 MiB 0.010 CRC64 data/MAUD-question-train-0.jsonl.xz
1 1 866.9 KiB 85.8 MiB 0.010 CRC64 data/MAUD-text_type-train-0.jsonl.xz
1 1 40.0 KiB 167.8 KiB 0.238 CRC64 data/MCExamsLaw-mc_exams_law_explain-train-0.jsonl.xz
1 1 28.2 KiB 114.9 KiB 0.246 CRC64 data/MCExamsLaw-mc_exams_law_no_explain-train-0.jsonl.xz
1 1 2091.3 KiB 43.3 MiB 0.047 CRC64 data/MiningLegalArguments-agent-train-0.jsonl.xz
1 1 2308.5 KiB 97.5 MiB 0.023 CRC64 data/MiningLegalArguments-argType-train-0.jsonl.xz
1 1 2063.9 KiB 12.1 MiB 0.167 CRC64 data/MultiLexSum-long_to_short-train-0.jsonl.xz
1 1 1171.9 KiB 6394.9 KiB 0.183 CRC64 data/MultiLexSum-long_to_tiny-train-0.jsonl.xz
1 1 230.9 KiB 1530.7 KiB 0.151 CRC64 data/MultiLexSum-short_to_tiny-train-0.jsonl.xz
1 1 9388.6 KiB 638.2 MiB 0.014 CRC64 data/NaturalInstructionsLegal-billsum_summarization-train-0.jsonl.xz
1 1 484.4 KiB 38.2 MiB 0.012 CRC64 data/NaturalInstructionsLegal-cail2018_answer_generation-train-0.jsonl.xz
1 1 3485.3 KiB 189.5 MiB 0.018 CRC64 data/NaturalInstructionsLegal-casehold_legal_answer_generation-train-0.jsonl.xz
1 1 3724.4 KiB 201.4 MiB 0.018 CRC64 data/NaturalInstructionsLegal-casehold_legal_incorrect_answer_generation-train-0.jsonl.xz
1 1 4300.6 KiB 228.0 MiB 0.018 CRC64 data/NaturalInstructionsLegal-cuad_answer_generation-train-0.jsonl.xz
1 1 4302.7 KiB 227.8 MiB 0.018 CRC64 data/NaturalInstructionsLegal-cuad_question_generation-train-0.jsonl.xz
1 1 201.9 KiB 22.7 MiB 0.009 CRC64 data/NaturalInstructionsLegal-eurlex_classification-train-0.jsonl.xz
1 1 284.3 KiB 19.9 MiB 0.014 CRC64 data/NaturalInstructionsLegal-eurlex_summarization-train-0.jsonl.xz
1 1 166.9 KiB 21.5 MiB 0.008 CRC64 data/NaturalInstructionsLegal-online_privacy_policy_text_information_type_generation-train-0.jsonl.xz
1 1 165.1 KiB 21.6 MiB 0.007 CRC64 data/NaturalInstructionsLegal-online_privacy_policy_text_purpose_answer_generation-train-0.jsonl.xz
1 1 246.8 KiB 16.4 MiB 0.015 CRC64 data/NaturalInstructionsLegal-overruling_legal_classification-train-0.jsonl.xz
1 1 5872.3 KiB 31.8 MiB 0.180 CRC64 data/OLCMemos-olc_memos-train-0.jsonl.xz
1 1 76.7 KiB 540.5 KiB 0.142 CRC64 data/PlainEnglishContractsSummarization-plain_english_contracts_summarization-train-0.jsonl.xz
1 1 1246.7 KiB 199.4 MiB 0.006 CRC64 data/PrivacyQA-privacy_qa-train-0.jsonl.xz
1 1 316.0 KiB 4538.1 KiB 0.070 CRC64 data/PrivacySummarization-privacy_summarization-train-0.jsonl.xz
1 1 1098.7 KiB 6969.7 KiB 0.158 CRC64 data/ReClor-reclor-train-0.jsonl.xz
1 1 50.5 MiB 412.2 MiB 0.123 CRC64 data/RedditLegalQA-reddit_legal_qa-train-0.jsonl.xz
1 1 11.0 KiB 134.9 KiB 0.082 CRC64 data/Sara-sara_entailment-train-0.jsonl.xz
1 1 11.5 KiB 145.1 KiB 0.079 CRC64 data/Sara-sara_tax_liability-train-0.jsonl.xz
1 1 36.6 KiB 586.5 KiB 0.062 CRC64 data/SaraProlog-sara_prolog_facts-train-0.jsonl.xz
1 1 18.2 KiB 132.2 KiB 0.138 CRC64 data/SaraProlog-sara_prolog_statute-train-0.jsonl.xz
1 1 90.3 KiB 1531.4 KiB 0.059 CRC64 data/ShortAnswerFeedback-short_answer_feedback_error_class-train-0.jsonl.xz
1 1 26.8 KiB 2218.5 KiB 0.012 CRC64 data/ShortAnswerFeedback-short_answer_feedback_openqa-train-0.jsonl.xz
1 1 91.0 KiB 1513.1 KiB 0.060 CRC64 data/ShortAnswerFeedback-short_answer_feedback_rating-train-0.jsonl.xz
1 1 16.0 KiB 118.9 KiB 0.135 CRC64 data/SpanishLaborLaw-spanish_labor_law-train-0.jsonl.xz
1 1 6562.3 KiB 31.0 MiB 0.207 CRC64 data/StackExchangeQuestionsLegal-stack_exchange_questions_legal-train-0.jsonl.xz
1 1 128.2 KiB 1080.4 KiB 0.119 CRC64 data/SwissCourtViewGeneration-swiss_judgment_court_view_generation_lower_court-train-0.jsonl.xz
1 1 901.2 MiB 5463.4 MiB 0.165 CRC64 data/SwissCourtViewGeneration-swiss_judgment_court_view_generation_same_court-train-0.jsonl.xz
1 1 211.4 MiB 1320.8 MiB 0.160 CRC64 data/SwissCriticalityPrediction-swiss_judgment_criticality-train-0.jsonl.xz
1 1 130.3 MiB 1984.4 MiB 0.066 CRC64 data/SwissJudgmentPrediction-swiss_judgment_multiple_choice-train-0.jsonl.xz
1 1 740.1 MiB 4651.8 MiB 0.159 CRC64 data/SwissJudgmentPredictionXL-swiss_judgment_dismiss_approve-train-0.jsonl.xz
1 1 39.3 MiB 252.1 MiB 0.156 CRC64 data/SwissLawAreaPrediction-swiss_judgment_area_of_law_main_area-train-0.jsonl.xz
1 1 39.3 MiB 252.2 MiB 0.156 CRC64 data/SwissLawAreaPrediction-swiss_judgment_area_of_law_sub_area-train-0.jsonl.xz
1 1 62.9 MiB 320.2 MiB 0.196 CRC64 data/SwissLeadingDecisions-swiss_judgment_location-train-0.jsonl.xz
1 1 60.3 MiB 398.0 MiB 0.152 CRC64 data/SwissLegislation-swiss_legislation_abbreviation-train-0.jsonl.xz
1 1 121.5 MiB 868.3 MiB 0.140 CRC64 data/SwissLegislation-swiss_legislation_canton-train-0.jsonl.xz
1 1 20.8 MiB 136.8 MiB 0.152 CRC64 data/SwissLegislation-swiss_legislation_short-train-0.jsonl.xz
1 1 121.8 MiB 872.3 MiB 0.140 CRC64 data/SwissLegislation-swiss_legislation_title-train-0.jsonl.xz
1 1 199.9 KiB 13.6 MiB 0.014 CRC64 data/TsccAlqac-tscc_alqac_question_answering-train-0.jsonl.xz
1 1 1221.8 KiB 20.3 MiB 0.059 CRC64 data/TurkishConstitutionalCourt-turkish_constitutional_multiple_choice-train-0.jsonl.xz
1 1 1130.7 KiB 10.0 MiB 0.110 CRC64 data/TurkishConstitutionalCourt-turkish_constitutional_violation_no_violation-train-0.jsonl.xz
1 1 3465.9 KiB 29.3 MiB 0.116 CRC64 data/USClassActions-us_class_actions_win_lose-train-0.jsonl.xz
1 1 94.7 KiB 2548.8 KiB 0.037 CRC64 data/ValidWills-valid_wills_entailment-train-0.jsonl.xz
-------------------------------------------------------------------------------
142 142 9.8 GiB 116.5 GiB 0.084 CRC64 142 files
```
## Dataset Creation
This dataset has been created by running the code from the [LawInstruct](https://github.com/JoelNiklaus/LawInstruct) repo.
For this public version of the dataset, we removed the CiviproQuestions, COLIEE, JECQA, and MBE datasets because of their restrictive licenses.
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{niklaus2024flawnt5,
title={FLawN-T5: An Empirical Examination of Effective Instruction-Tuning Data Mixtures for Legal Reasoning},
author={Joel Niklaus and Lucia Zheng and Arya D. McCarthy and Christopher Hahn and Brian M. Rosen and Peter Henderson and Daniel E. Ho and Garrett Honke and Percy Liang and Christopher Manning},
year={2024},
eprint={2404.02127},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@JoelNiklaus](https://github.com/joelniklaus) for adding this dataset.
|
Nexdata/Chinese_Mandarin_Speech_Synthesis_Corpus-Female_Imitating_Children | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Chinese_Mandarin_Speech_Synthesis_Corpus-Female_Imitating_Children
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1091?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Female audio data of adults imitating children's voices: 6,599 sentences in total, amounting to 6.78 hours. It is recorded by native Chinese speakers with an authentic accent and a sweet voice, and the phoneme coverage is balanced. Professional phoneticians participated in the annotation. The corpus precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1091?source=Huggingface
### Supported Tasks and Leaderboards
tts: The dataset can be used to train a model for Text to Speech (TTS).
### Languages
Mandarin Chinese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
Abhitej5965/textToDDLQuery | ---
license: apache-2.0
---
|
zolak/twitter_dataset_81_1713222784 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 158945
num_examples: 389
download_size: 83905
dataset_size: 158945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Thefoodprocessor/recipe_new_with_features_full | ---
dataset_info:
features:
- name: recipe_original
dtype: string
- name: title_original
dtype: string
- name: title_cleaned
dtype: string
- name: recipe_new
dtype: string
- name: wine_type
dtype: string
- name: allergy_type
dtype: string
- name: diet_type
dtype: string
- name: holiday
dtype: string
- name: cuisine_type
dtype: string
- name: meal_type
dtype: string
- name: ingredients_alternatives
dtype: string
splits:
- name: train
num_bytes: 248827563
num_examples: 74465
download_size: 117992806
dataset_size: 248827563
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
inwaves/dtchess-standard | ---
license: mit
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo2_100_kl_0.1_prm_70m_thr_0.0_seed_2_t_1.0 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43551536
num_examples: 18929
- name: epoch_1
num_bytes: 44048505
num_examples: 18929
- name: epoch_2
num_bytes: 44112887
num_examples: 18929
- name: epoch_3
num_bytes: 44144097
num_examples: 18929
- name: epoch_4
num_bytes: 44162457
num_examples: 18929
- name: epoch_5
num_bytes: 44168530
num_examples: 18929
- name: epoch_6
num_bytes: 44174799
num_examples: 18929
- name: epoch_7
num_bytes: 44177081
num_examples: 18929
- name: epoch_8
num_bytes: 44181196
num_examples: 18929
- name: epoch_9
num_bytes: 44181960
num_examples: 18929
- name: epoch_10
num_bytes: 44183728
num_examples: 18929
- name: epoch_11
num_bytes: 44184624
num_examples: 18929
- name: epoch_12
num_bytes: 44184143
num_examples: 18929
- name: epoch_13
num_bytes: 44185718
num_examples: 18929
- name: epoch_14
num_bytes: 44184424
num_examples: 18929
- name: epoch_15
num_bytes: 44184799
num_examples: 18929
- name: epoch_16
num_bytes: 44185474
num_examples: 18929
- name: epoch_17
num_bytes: 44185636
num_examples: 18929
- name: epoch_18
num_bytes: 44185123
num_examples: 18929
- name: epoch_19
num_bytes: 44186926
num_examples: 18929
- name: epoch_20
num_bytes: 44186621
num_examples: 18929
- name: epoch_21
num_bytes: 44184941
num_examples: 18929
- name: epoch_22
num_bytes: 44185088
num_examples: 18929
- name: epoch_23
num_bytes: 44186826
num_examples: 18929
- name: epoch_24
num_bytes: 44187047
num_examples: 18929
- name: epoch_25
num_bytes: 44187605
num_examples: 18929
- name: epoch_26
num_bytes: 44186460
num_examples: 18929
- name: epoch_27
num_bytes: 44188504
num_examples: 18929
- name: epoch_28
num_bytes: 44187911
num_examples: 18929
- name: epoch_29
num_bytes: 44186570
num_examples: 18929
download_size: 698567953
dataset_size: 1324621216
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
Frixi/Luz_Noceda_Eng_6Mins | ---
license: openrail
---
|
ramgus/audiofeaturesalbumcovers | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 636771303.2
num_examples: 1200
download_size: 520392718
dataset_size: 636771303.2
---
# Dataset Card for "audiofeaturesalbumcovers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713069246 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 857621
num_examples: 2464
download_size: 329244
dataset_size: 857621
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Joshua-Abok/preprocessed_samsum_and_dialogsum | ---
dataset_info:
features:
- name: dialogue
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 19792641
num_examples: 20000
- name: valid
num_bytes: 1035442
num_examples: 1318
- name: test
num_bytes: 2013667
num_examples: 2319
download_size: 12309269
dataset_size: 22841750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
NotHEre/Rafael | ---
license: openrail
---
|
nakcnx/sql-context-splitted | ---
dataset_info:
features:
- name: question
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 16491906.06065388
num_examples: 74648
- name: test
num_bytes: 434125.4341601232
num_examples: 1965
- name: valid
num_bytes: 433904.5051859959
num_examples: 1964
download_size: 8589535
dataset_size: 17359936.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
zolak/twitter_dataset_80_1713058513 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3845885
num_examples: 9434
download_size: 1938363
dataset_size: 3845885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
albertvillanova/tmp-multilingual | ---
language:
- multilingual
- mul
--- |
gcjavi/parlaspeech-tests | ---
configs:
- config_name: clean
data_files:
- split: train
path: "data/clean/train/train_clean.tsv"
#- "data/clean/train/train_clean_2.tsv"
#- "data/clean/train/train_clean_3.tsv"
#- "data/clean/train/train_clean_4.tsv"
- split: dev
path: "data/clean/dev/dev_clean.tsv"
- split: test
path: "data/clean/test/test_clean.tsv"
- config_name: other
data_files:
- split: train
path: "data/other/train/train_other.tsv"
- split: dev
path: "data/other/dev/dev_other.tsv"
- split: test
path: "data/other/test/test_other.tsv"
---
|
lsr42/msmarco-passage-doct5query | ---
license: unknown
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: passage
num_bytes: 16870476814
num_examples: 8841823
download_size: 5773175789
dataset_size: 16870476814
---
|
dynopii/OpenOrca-Top5percent | ---
language:
- en
license: mit
task_categories:
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
pretty_name: OpenOrca-Top5Percent
size_categories:
- 1M<n<10M
---
<p><h1>🐋 The OpenOrca-Top5Percent Dataset! 🐋</h1></p>
We are excited to introduce the OpenOrca-Top5Percent dataset, a refined version of the original [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca). This dataset contains only those entries which utilize the top 5% most frequently used words in the OpenOrca dataset, aiming to focus on high-frequency vocabulary for various NLP tasks.
# Dataset Summary
The OpenOrca-Top5Percent dataset is a curated subset of the augmented [FLAN Collection data](https://arxiv.org/abs/2301.13688), focusing specifically on entries that incorporate the most commonly used words across ~1M GPT-4 completions and ~3.2M GPT-3.5 completions. It represents a narrowed scope with the intent of fostering research and applications where high-frequency vocabulary usage is critical.
# Dataset Attribution
This dataset builds upon the efforts and contributions of the OpenOrca dataset team and contributors. Special thanks to the original OpenOrca contributors, as well as the community around it, for making the foundational dataset available.
# Supported Tasks and Leaderboards
OpenOrca-Top5Percent supports a similar range of NLP tasks as the original dataset, particularly those benefiting from a focus on high-usage vocabulary, including but not limited to language modeling, text generation, summarization, and more. It offers a unique dataset for exploring the impacts of vocabulary frequency on various NLP tasks.
# Languages
The primary language of the dataset is English.
# Dataset Structure
## Data Instances
Each instance in this dataset reflects the structure of the original OpenOrca dataset but is specifically filtered to only include entries with the top 5% most used words, aiming to maintain the richness of the data while focusing on common vocabulary.
## Data Fields
Fields remain consistent with the original OpenOrca dataset, including 'id', 'system_prompt', 'question', and 'response', ensuring compatibility with existing models and tools designed for OpenOrca.
## Data Splits
The dataset is provided as a single, unsplit collection, simplifying use and access.
# Dataset Creation
## Curation Rationale
The creation of OpenOrca-Top5Percent is motivated by the desire to investigate the effects of focusing on high-frequency vocabulary in NLP tasks, potentially improving efficiency and performance in specific applications.
## Source Data
The source data for this dataset is derived from the original OpenOrca dataset, filtered to focus on entries containing only the top 5% most frequently used words.
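The exact filtering code is not published on this card; as a hedged illustration, filtering entries down to a top-fraction vocabulary could look like the following sketch, which keeps only entries composed entirely of the most frequent words (the tokenization and the keep-only-if-all-words-are-common policy are assumptions):

```python
from collections import Counter

def top_fraction_vocab(texts, fraction=0.05):
    # Count whitespace-tokenized, lowercased words across the corpus,
    # then keep the top `fraction` of the unique vocabulary by frequency.
    counts = Counter(w for t in texts for w in t.lower().split())
    k = max(1, int(len(counts) * fraction))
    return {w for w, _ in counts.most_common(k)}

def filter_entries(texts, vocab):
    # Keep entries whose words all fall inside the high-frequency vocabulary.
    return [t for t in texts if all(w in vocab for w in t.lower().split())]

corpus = ["the cat", "the cat", "the dog", "the bird"]
vocab = top_fraction_vocab(corpus, fraction=0.5)   # -> {"the", "cat"}
print(filter_entries(corpus, vocab))               # -> ['the cat', 'the cat']
```

On the real dataset the same idea would be applied to the 'question' and 'response' fields rather than a toy corpus.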
# Dataset Use
## Use Cases
OpenOrca-Top5Percent is ideal for use cases where high-frequency vocabulary is of particular interest, including educational applications, simplified text generation, and more.
## Usage Caveats
As with any filtered dataset, users should consider the implications of the narrowed vocabulary scope on their specific applications and research.
## Getting Started
This dataset is structured for easy loading via the Hugging Face datasets library, with considerations for efficient use given its focus on high-frequency vocabulary. Users are encouraged to explore the potential of this specialized dataset in their work.
# Citation
Please cite the original OpenOrca dataset when using OpenOrca-Top5Percent in your research or applications, along with any specific papers or resources related to your work that utilize this dataset.
```bibtex
@misc{OpenOrca-Top5Percent,
title = {OpenOrca-Top5Percent: A Filtered Subset of OpenOrca Focusing on High-Frequency Vocabulary},
author = {Anubhav Singh},
year = {2023},
publisher = {Dynopii},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/datasets/dynopii/OpenOrca-Top5percent}},
}
```
```bibtex
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/datasets/Open-Orca/OpenOrca}},
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
  eprint={2307.09288},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
@software{touvron2023llama,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
```
--- |
distilled-from-one-sec-cv12/chunk_253 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 945062932
num_examples: 184151
download_size: 960503342
dataset_size: 945062932
---
# Dataset Card for "chunk_253"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hyokwan/dataset_llama_hk2 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
richfrain/pokemon-blip-captions | ---
license: apache-2.0
---
|
Ivus234/Joao2 | ---
license: openrail
---
|
Gbssreejith/Birth_type1_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 96340182.0
num_examples: 249
- name: val
num_bytes: 10782724.0
num_examples: 28
download_size: 107068662
dataset_size: 107122906.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
tyzhu/lmind_nq_train300_eval100_v1_reciteonly_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 34574
num_examples: 300
- name: train_recite_qa
num_bytes: 226733
num_examples: 300
- name: eval_qa
num_bytes: 11254
num_examples: 100
- name: eval_recite_qa
num_bytes: 74768
num_examples: 100
- name: all_docs
num_bytes: 254478
num_examples: 392
- name: all_docs_eval
num_bytes: 254451
num_examples: 392
- name: train
num_bytes: 226733
num_examples: 300
- name: validation
num_bytes: 74768
num_examples: 100
download_size: 760921
dataset_size: 1157759
---
# Dataset Card for "lmind_nq_train300_eval100_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ckandemir/amazon-products | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: Product Name
dtype: string
- name: Category
dtype: string
- name: Description
dtype: string
- name: Selling Price
dtype: string
- name: Product Specification
dtype: string
- name: Image
dtype: string
splits:
- name: train
num_bytes: 12542887
num_examples: 23993
- name: test
num_bytes: 3499375
num_examples: 6665
- name: eval
num_bytes: 1376174
num_examples: 2666
download_size: 6391314
dataset_size: 17418436
license: apache-2.0
task_categories:
- image-classification
- image-to-text
language:
- en
size_categories:
- 10K<n<100K
---
## Dataset Creation and Processing Overview
This dataset underwent a comprehensive process of loading, cleaning, processing, and preparation, incorporating a range of data manipulation and NLP techniques to optimize its utility for machine learning models, particularly in natural language processing.
### Data Loading and Initial Cleaning
- **Source**: Loaded from the Hugging Face dataset repository [bprateek/amazon_product_description](https://huggingface.co/datasets/bprateek/amazon_product_description).
- **Conversion to Pandas DataFrame**: For ease of data manipulation.
- **Null Value Removal**: Rows with null values in the 'About Product' column were discarded.
### Data Cleaning and NLP Processing
- **Sentence Extraction**: 'About Product' descriptions were split into sentences, identifying common phrases.
- **Emoji and Special Character Removal**: A regex function removed these elements from the product descriptions.
- **Common Phrase Elimination**: A function was used to strip common phrases from each product description.
- **Improving Writing Standards**: Adjusted capitalization, punctuation, and replaced '&' with 'and' for better readability and formalization.
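A minimal sketch of the emoji/special-character removal and common-phrase stripping steps described above (the exact regex pattern and phrase list used in the original pipeline are not published, so the ones below are illustrative assumptions):

```python
import re

# Illustrative keep-list: letters, digits, spaces, and basic punctuation;
# everything else (emoji, symbols) is removed.
_STRIP = re.compile(r"[^A-Za-z0-9 .,!?'\-]+")

def remove_emoji_and_specials(text: str) -> str:
    return _STRIP.sub("", text).strip()

def strip_common_phrases(text: str, phrases: list[str]) -> str:
    # Remove boilerplate phrases, then collapse any leftover double spaces.
    for phrase in phrases:
        text = text.replace(phrase, "")
    return re.sub(r"\s{2,}", " ", text).strip()

desc = remove_emoji_and_specials("Make sure this fits your device ✅🎉")
print(strip_common_phrases(desc, ["Make sure this fits"]))  # -> "your device"
```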
### Sentence Similarity Analysis
- **Model Application**: The pre-trained Sentence Transformer model 'all-MiniLM-L6-v2' was used.
- **Sentence Comparison**: Identified the most similar sentence to each product name within the cleaned product descriptions.
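Assuming the embeddings have already been computed (e.g., with `all-MiniLM-L6-v2` via `sentence-transformers`), the selection step reduces to a cosine-similarity argmax, sketched here in plain Python with toy vectors standing in for real model output:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def most_similar(name_vec, sentence_vecs, sentences):
    # Return the description sentence whose embedding is closest
    # to the product-name embedding.
    best = max(range(len(sentences)),
               key=lambda i: cosine(name_vec, sentence_vecs[i]))
    return sentences[best]

name_vec = [1.0, 0.0, 0.0]                      # toy 3-d "embedding"
vecs = [[0.0, 1.0, 0.0], [0.9, 0.1, 0.0]]
print(most_similar(name_vec, vecs, ["unrelated", "close match"]))  # -> close match
```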
### Dataset Refinement
- **Column Selection**: Retained relevant columns for final dataset.
- **Image URL Processing**: Split multiple image URLs into individual URLs, removing specific unwanted URLs.
### Image Validation
- **Image URL Validation**: Implemented a function to verify the validity of each image URL.
- **Filtering Valid Images**: Retained only rows with valid image URLs.
### Dataset Splitting for Machine Learning
- **Creation of Train, Test, and Eval Sets**: Used scikit-learn's `train_test_split` for dataset division.
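The card states that scikit-learn's `train_test_split` was used; a stdlib-only sketch of the equivalent three-way split is shown below (the 20%/8% test/eval fractions are inferred from the published split sizes and are approximate):

```python
import random

def three_way_split(rows, test_frac=0.20, eval_frac=0.08, seed=42):
    # Shuffle deterministically, then carve off test and eval portions.
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    n = len(rows)
    n_test = int(n * test_frac)
    n_eval = int(n * eval_frac)
    test = rows[:n_test]
    eval_ = rows[n_test:n_test + n_eval]
    train = rows[n_test + n_eval:]
    return train, test, eval_

train, test, eval_ = three_way_split(range(1000))
print(len(train), len(test), len(eval_))  # -> 720 200 80
```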
For further details or to contribute to enhancing the dataset card, please refer to the [Hugging Face Dataset Card Contribution Guide](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards). |
tyzhu/find_word_1000 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 233196
num_examples: 3000
- name: eval_find_word
num_bytes: 53196
num_examples: 1000
download_size: 136283
dataset_size: 286392
---
# Dataset Card for "find_word_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlketaR/embedded_faqs_medicarealk | ---
license: openrail
---
|
linhtran92/asr_data_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 3656462.0
num_examples: 44
download_size: 3639719
dataset_size: 3656462.0
---
# Dataset Card for "asr_data_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hani89/Medical_ASR_45HRs | ---
license: apache-2.0
task_categories:
- automatic-speech-recognition
language:
- en
tags:
- medical
size_categories:
- 10K<n<100K
---
# Medical Dataset for ASR
The dataset is a subset of [The MedDialog dataset](https://huggingface.co/datasets/medical_dialog). We used only icliniq_dialogue.txt and applied the following preprocessing:
- Remove all characters except `[a-zA-Z0-9,.]`.
- Break each conversation into rows of 32 to 35 words.
- Remove duplicates.
- Fix typos using a GPT-3 instruction model.
- Used Suno/Bark to create ~15K audio clips with different voices [*In Progress*]
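The text-cleaning and chunking steps can be sketched as below. This is a minimal illustration under assumptions: the fixed chunk size of 35 approximates the card's 32-35 word target, and the sample `dialogue` string is invented; the typo-fixing and Bark synthesis steps are omitted.

```python
import re

def clean_text(text):
    """Keep only letters, digits, commas, and periods; collapse everything else to spaces."""
    text = re.sub(r"[^a-zA-Z0-9,.]+", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def chunk_words(text, size=35):
    """Break a conversation into rows of at most `size` words (card targets 32-35)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

dialogue = "Hello doctor, I have had a mild fever for 3 days!!! " * 10
rows = chunk_words(clean_text(dialogue))
rows = list(dict.fromkeys(rows))  # remove duplicate rows, preserving order
```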
#### Note:
- We expect about 45 hours of medical audio clips.
- The dataset will be released soon; for any inquiries please contact me at hmthubaiti@uqu.edu.sa. |
pk3388/train | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 312235.0
num_examples: 9
download_size: 247841
dataset_size: 312235.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Myashka/SO-Python_QA-filtered-2023-tanh_score-after_2023_02 | ---
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
SO dataset of `python`-tag data
Question filters:
- images
- links
- code blocks
- Q_Score > 0
- Answer_count > 0
- CreationDate > 2023-02-01
Answers filters:
- images
- links
- code blocks
Scores are computed by scaling the original SO answers' scores with an AbsMax scaler fitted to their IQR range, then applying tanh. |
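A minimal sketch of that score transform, under assumptions: the exact scaler fitting in the original pipeline is not specified, so here the scale factor is taken as the absolute maximum of the interquartile range before the tanh squash.

```python
import numpy as np

def tanh_score(scores):
    """Scale raw answer scores by the absolute maximum of their interquartile
    range, then squash with tanh into (-1, 1). Reconstruction, not the
    dataset's exact code."""
    q1, q3 = np.percentile(scores, [25, 75])
    abs_max = max(abs(q1), abs(q3)) or 1.0
    return np.tanh(scores / abs_max)

raw = np.array([-3.0, 0.0, 1.0, 2.0, 5.0, 40.0])
normed = tanh_score(raw)  # strictly increasing values in (-1, 1)
```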
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/058e2bc3 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1339
dataset_size: 184
---
# Dataset Card for "058e2bc3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5 | ---
pretty_name: Evaluation run of decem/Dionysus-Mistral-m3-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [decem/Dionysus-Mistral-m3-v5](https://huggingface.co/decem/Dionysus-Mistral-m3-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T07:41:42.571559](https://huggingface.co/datasets/open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5/blob/main/results_2024-01-05T07-41-42.571559.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6119654723808889,\n\
\ \"acc_stderr\": 0.03286363978834633,\n \"acc_norm\": 0.6149242684593406,\n\
\ \"acc_norm_stderr\": 0.03352054924433844,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5093204017955075,\n\
\ \"mc2_stderr\": 0.015839968447220742\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186043,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.01434203648343618\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6283608842859988,\n\
\ \"acc_stderr\": 0.004822550638450897,\n \"acc_norm\": 0.8098984266082454,\n\
\ \"acc_norm_stderr\": 0.003915792315457797\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"\
acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094753,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073403,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073403\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636864,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636864\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02699254433929724,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02699254433929724\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547228,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954854,\n \
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954854\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547738,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547738\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5093204017955075,\n\
\ \"mc2_stderr\": 0.015839968447220742\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403107\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.510235026535254,\n \
\ \"acc_stderr\": 0.013769598923012395\n }\n}\n```"
repo_url: https://huggingface.co/decem/Dionysus-Mistral-m3-v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|arc:challenge|25_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|gsm8k|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hellaswag|10_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T07-41-42.571559.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- '**/details_harness|winogrande|5_2024-01-05T07-41-42.571559.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T07-41-42.571559.parquet'
- config_name: results
data_files:
- split: 2024_01_05T07_41_42.571559
path:
- results_2024-01-05T07-41-42.571559.parquet
- split: latest
path:
- results_2024-01-05T07-41-42.571559.parquet
---
# Dataset Card for Evaluation run of decem/Dionysus-Mistral-m3-v5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [decem/Dionysus-Mistral-m3-v5](https://huggingface.co/decem/Dionysus-Mistral-m3-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T07:41:42.571559](https://huggingface.co/datasets/open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5/blob/main/results_2024-01-05T07-41-42.571559.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its own split, with the "latest" split pointing to the most recent one):
```python
{
"all": {
"acc": 0.6119654723808889,
"acc_stderr": 0.03286363978834633,
"acc_norm": 0.6149242684593406,
"acc_norm_stderr": 0.03352054924433844,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5093204017955075,
"mc2_stderr": 0.015839968447220742
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186043,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.01434203648343618
},
"harness|hellaswag|10": {
"acc": 0.6283608842859988,
"acc_stderr": 0.004822550638450897,
"acc_norm": 0.8098984266082454,
"acc_norm_stderr": 0.003915792315457797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094753,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073403,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636864,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636864
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02699254433929724,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02699254433929724
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547228,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954854,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547738,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547738
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5093204017955075,
"mc2_stderr": 0.015839968447220742
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403107
},
"harness|gsm8k|5": {
"acc": 0.510235026535254,
"acc_stderr": 0.013769598923012395
}
}
```
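The aggregated `"all"` accuracy above is the mean over the per-task entries. As a sanity check, you can recompute it yourself; a minimal sketch (assuming the results JSON has been loaded into a Python dict, and illustrated here with only a couple of the task entries shown above):

```python
def mean_mmlu_acc(results: dict) -> float:
    """Average the 'acc' metric over all hendrycksTest (MMLU) task entries."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Illustrative subset of the results dict above:
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
    "harness|winogrande|5": {"acc": 0.7513812154696132},  # skipped: not an MMLU task
}
print(mean_mmlu_acc(sample))  # mean of the two MMLU accuracies only
```

Run over the full results file, this reproduces the MMLU portion of the aggregate; the leaderboard averages other benchmarks (ARC, HellaSwag, TruthfulQA, Winogrande, GSM8K) separately.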
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Nexdata/29523_People_Face_Recognition_Data_with_Identification_Photos | ---
license: cc-by-nc-nd-4.0
---
## Description
29,523 People Face Recognition Data with Identification Photos. The race distribution of the data includes Asian, Caucasian, black, and brown subjects. For each subject, one ID photo and 5-10 life photos were collected. This data can be used for face recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1020?source=Huggingface
# Specifications
## Data size
29,523 people, one ID photo and 5-10 life photos per person
## Race distribution
2,099 black people, 2,238 Caucasian people, 841 brown (Mexicans) people and 24,345 Asian people
## Gender distribution
14,790 males, 14,733 females
## Age distribution
ranging from teenagers to the elderly, with middle-aged and young people in the majority
## Collecting environment
including indoor and outdoor scenes
## Data diversity
different poses, races or nationalities, ages and collecting scenes
## Device
cellphone
## Data format
.jpg, .jpeg, .png
## Accuracy
The accuracy of the gender, race or nationality and age labels is more than 97%
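As a quick sanity check, the race and gender breakdowns stated above can be verified to sum to the overall subject count (counts taken from this card):

```python
# Sanity-check that the stated race and gender breakdowns
# both sum to the overall subject count of 29,523.
TOTAL = 29_523

race_counts = {
    "black": 2_099,
    "Caucasian": 2_238,
    "brown": 841,
    "Asian": 24_345,
}
gender_counts = {"male": 14_790, "female": 14_733}

assert sum(race_counts.values()) == TOTAL
assert sum(gender_counts.values()) == TOTAL
print("race and gender breakdowns are consistent with the total")
```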
# Licensing Information
Commercial License
|
garcianacho/DPI | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 106579679
num_examples: 150000
download_size: 96335003
dataset_size: 106579679
---
# Dataset Card for "DPI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roborovski/celeba-faces-captioned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: pixel_values
sequence:
sequence:
sequence: float32
- name: captions
dtype: string
splits:
- name: train
num_bytes: 17810785215.0
num_examples: 10000
download_size: 475025277
dataset_size: 17810785215.0
---
# Dataset Card for "celeba-faces-captioned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hpprc/jsquad-mined | ---
dataset_info:
features:
- name: passage_id
dtype: int64
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: mined_neg_ids
sequence: int64
- name: mined_neg_sims
sequence: float64
splits:
- name: train
num_bytes: 242044763
num_examples: 62859
download_size: 139992000
dataset_size: 242044763
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
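The `mined_neg_ids` / `mined_neg_sims` columns pair each query with candidate hard-negative passages and their similarity scores. A minimal sketch of selecting the top-k hardest negatives for one record — the field names come from the schema above, but the example record and its values are purely illustrative:

```python
# Select the k highest-similarity mined negatives for one record.
# The record layout mirrors the dataset schema; values are made up.
def top_k_negatives(record, k=2):
    pairs = sorted(
        zip(record["mined_neg_ids"], record["mined_neg_sims"]),
        key=lambda p: p[1],
        reverse=True,
    )
    return [passage_id for passage_id, _ in pairs[:k]]

record = {
    "query": "example query",
    "mined_neg_ids": [10, 42, 7],
    "mined_neg_sims": [0.31, 0.78, 0.55],
}
print(top_k_negatives(record))  # hardest negatives first: [42, 7]
```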
|
jan-hq/capybara_dpo_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 71347705.65807219
num_examples: 6806
- name: test
num_bytes: 7935676.341927807
num_examples: 757
download_size: 40834468
dataset_size: 79283382.0
---
# Dataset Card for "capybara_dpo_binarized"
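Each row holds `chosen` and `rejected` message lists in `{content, role}` form, as described by the schema above. A minimal, hypothetical sketch of flattening such a pair into prompt/response strings for preference training — the example row is illustrative, not taken from the data, and it assumes the two lists share everything before the final assistant turn:

```python
# Flatten one DPO row ({"chosen": [...], "rejected": [...]}) into
# (prompt, chosen_response, rejected_response). Assumes the shared
# conversation prefix is everything before the final assistant turn.
def split_pair(row):
    prompt = "\n".join(
        f'{m["role"]}: {m["content"]}' for m in row["chosen"][:-1]
    )
    return prompt, row["chosen"][-1]["content"], row["rejected"][-1]["content"]

row = {
    "chosen": [
        {"role": "user", "content": "What is 2+2?"},
        {"role": "assistant", "content": "4"},
    ],
    "rejected": [
        {"role": "user", "content": "What is 2+2?"},
        {"role": "assistant", "content": "5"},
    ],
}
prompt, good, bad = split_pair(row)
print(prompt)       # user: What is 2+2?
print(good, bad)    # 4 5
```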
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Iceclear/AVA | ---
license: apache-2.0
---
AVA: A Large-Scale Database for Aesthetic Visual Analysis
See [Github Page](https://github.com/imfing/ava_downloader/tree/master/AVA_dataset) for tags.
## Citation
```bibtex
@inproceedings{murray2012ava,
title={AVA: A large-scale database for aesthetic visual analysis},
author={Murray, Naila and Marchesotti, Luca and Perronnin, Florent},
booktitle={CVPR},
year={2012},
}
``` |
jdabello/amontillado | ---
license: apache-2.0
---
|
DataStudio/OCR_underline_part_5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1509228890.875
num_examples: 77465
download_size: 1510622408
dataset_size: 1509228890.875
---
# Dataset Card for "OCR_underline_part_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |