| datasetId | card |
|---|---|
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_6 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4863217.322740098
num_examples: 4997
download_size: 2463978
dataset_size: 4863217.322740098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thanaphatt1/semi-training_set-v2 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: word_ids
sequence: int64
- name: ner_tags
sequence: int64
- name: id
dtype: string
- name: fname
dtype: string
- name: pos_tags
sequence: int64
- name: clause_tags
sequence: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 27456882
num_examples: 18571
download_size: 2918093
dataset_size: 27456882
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/tohsaka_rin_fatekaleidlinerprismaillya | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tohsaka Rin
This is the dataset of Tohsaka Rin, containing 299 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 299 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 643 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 299 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 299 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 299 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 299 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 299 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 643 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 643 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 643 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
heliosprime/twitter_dataset_1713091440 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3061
num_examples: 9
download_size: 7688
dataset_size: 3061
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713091440"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ali9971/pumbeddata | ---
license: apache-2.0
---
|
TheSkullery/Aether-V1.5 | ---
language:
- en
license: apache-2.0
size_categories:
- 1M<n<10M
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: system
dtype: string
- name: tools
dtype: string
splits:
- name: train
num_bytes: 4655376981
num_examples: 2712289
download_size: 2446047146
dataset_size: 4655376981
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- not-for-all-audiences
---
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Data Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
<style>
body {
font-family: 'Quicksand', sans-serif;
background-color: #1A202C;
color: #D8DEE9;
margin: 0;
padding: 0; /* Remove default padding */
font-size: 16px;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
}
p {
padding-left: 10px
}
.container {
width: 100%;
margin: auto;
background-color: rgb(255 255 255 / 1%);
padding: 20px 30px 40px; /* Add padding below the image only */
padding-right: 32px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.05);
}
.header {
display: flex;
align-items: center;
justify-content: space-between;
gap: 20px;
}
img {
border-radius: 10px 10px 0 0!important;
padding-left: 0px !important;
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.info {
background-color: rgba(255, 255, 255, 0.05);
color: #AEBAC7;
border-radius: 12px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.2);
font-size: 14px;
line-height: 1.6;
margin-left: 5px;
overflow-x: auto;
margin-top: 20px; /* Adjusted margin */
border: 1px solid rgba(255, 255, 255, 0.05);
transition: background-color 0.6s ease; /* Smooth transition over 0.5 seconds */
}
.info:hover {
}
.info img {
width: 100%;
border-radius: 10px 10px 0 0;
margin-top: -20px; /* Negative margin to overlap container margin */
}
a {
color: #88C0D0;
text-decoration: none;
transition: color 0.3s ease;
position: relative;
}
a:hover {
color: #A3BE8C;
text-decoration: none;
}
a::before {
content: '';
position: absolute;
width: 100%;
height: 2px;
bottom: 0;
left: 0;
background-color: #A3BE8C;
visibility: hidden;
transform: scaleX(0);
transition: all 0.3s ease-in-out;
}
a:hover::before {
visibility: visible;
transform: scaleX(1);
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
transition: background-color 0.3s ease;
}
.button:hover {
background-color: #81A1C1;
}
</style>
</head>
<body>
<div class="container">
<div class="header">
<h1>Aether Dataset</h1>
</div>
<div class="info">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/N4UWofDAapZ_kCraMuQDJ.webp" style="border-radius: 10px;">
<p><strong>Creator:</strong> <a href="https://huggingface.co/Steelskull" target="_blank">SteelSkull</a></p>
<p><strong>Community Organization:</strong> <a href="https://huggingface.co/ConvexAI" target="_blank">ConvexAI</a></p>
<p><strong>Discord:</strong> <a href="https://discord.gg/yYqmNmg7Wj" target="_blank">Join us on Discord</a></p>
<div>
<div>
<p><strong>About Aether:</strong> The Aether dataset, generated with a rebuilt processing script.</p>
<p>From v1.2.2 to v1.5, the set of source datasets changed and two were added.</p>
<p>Version 1.5 reworks the human -> gpt conversations and adds <code>system</code> and <code>tools</code> columns.</p>
<p><strong>Source Datasets:</strong></p>
<ul>
<li>grimulkan/bluemoon_Karen_cleaned</li>
<li>Doctor-Shotgun/no-robots-sharegpt</li>
<li>Locutusque/hercules-v2.5</li>
<li>jondurbin/airoboros-3.2</li>
<li>openerotica/freedom-rp</li>
<li>teknium/OpenHermes-2.5</li>
<li>Doctor-Shotgun/capybara-sharegpt</li>
<li>KaraKaraWitch/PIPPA-ShareGPT-formatted</li>
<li>Locutusque/bagel-clean-v0.3-shuffled</li>
</ul>
<p><strong>Phrases and Data Removed:</strong></p>
<p>To enhance the dataset's coherence and relevance across varied contexts, certain phrases have been selectively omitted. each dataset is run against a "keyed" list of phrases.
<p>Filtering Stats:
<p>Total Objects Removed: 72114
<p>
<p>Deduplication:
<p>Initial row count: 3296307
<p>Final row count: 2728791
<p>Rows removed: 567516
<p>Filter:
<ul>
<li>Couldn't help but</li>
<li>Can't resist</li>
<li>I'm sorry, but</li>
<li>As an AI</li>
<li>However, it is important to</li>
<li>Cannot provide</li>
<li>And others</li>
</ul>
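<p>As an illustrative sketch only (the actual Aether pipeline is not published, so the phrase list handling and dedup key below are assumptions), the phrase filtering and exact-match deduplication described above could look like:</p>

```python
# Hypothetical sketch of the phrase filtering and deduplication described
# above; the real Aether pipeline is not published, so this is an assumption.
FILTER_PHRASES = [
    "couldn't help but",
    "can't resist",
    "i'm sorry, but",
    "as an ai",
    "however, it is important to",
    "cannot provide",
]

def keep_conversation(turns):
    """Return False if any turn contains a filtered phrase (case-insensitive)."""
    for turn in turns:
        text = turn.get("value", "").lower()
        if any(phrase in text for phrase in FILTER_PHRASES):
            return False
    return True

def deduplicate(rows):
    """Drop exact-duplicate rows, keyed on their serialized conversations."""
    seen, unique = set(), []
    for row in rows:
        key = str(row["conversations"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique
```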
</div>
</div>
</div>
</div>
</body>
</html> |
heliosprime/twitter_dataset_1713009658 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10807
num_examples: 24
download_size: 9143
dataset_size: 10807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713009658"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
myvision/CS4248-T15-LUN | ---
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 159234160
num_examples: 48854
- name: test
num_bytes: 9048910
num_examples: 3000
download_size: 104858010
dataset_size: 168283070
---
# Dataset Card for "CS4248-T15-LUN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pankaja/microbe | ---
license: apache-2.0
---
|
Rewcifer/ct_scans_90pct_2048_cutoff | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 842235884.5219477
num_examples: 168647
download_size: 154765997
dataset_size: 842235884.5219477
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ct_scans_90pct_2048_cutoff"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-17000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 657590
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amithm3/shr | ---
dataset_info:
- config_name: audio
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 35605239.0
num_examples: 126
- name: test
num_bytes: 30002421.0
num_examples: 133
download_size: 65356135
dataset_size: 65607660.0
- config_name: default
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 35605255.0
num_examples: 126
- name: test
num_bytes: 30002438.0
num_examples: 133
download_size: 65356135
dataset_size: 65607693.0
- config_name: meta
features:
- name: file_name
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 96526
num_examples: 259
download_size: 32849
dataset_size: 96526
configs:
- config_name: audio
data_files:
- split: train
path: audio/train-*
- split: test
path: audio/test-*
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- config_name: meta
data_files:
- split: train
path: meta/train-*
---
|
Vinnyyw/Anymoney | ---
license: openrail
---
|
louisbrulenaudet/code-sport | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code du sport
source_datasets:
- original
pretty_name: Code du sport
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code du sport, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
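As an illustration, one record with these fields can be assembled into a single training prompt. The Alpaca-style template below is an assumption for illustration only; only the field names come from this card:

```python
def build_prompt(record: dict) -> str:
    """Assemble one Alpaca-style training prompt from a record.

    The template itself is an assumption; only the field names
    (instruction, input, output) come from this dataset card.
    """
    return (
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Input:\n{record['input']}\n\n"
        f"### Response:\n{record['output']}"
    )

# A hypothetical record shaped like the fields described above.
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code du sport, art. L100-1",
    "output": "Texte de l'article...",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
    "num": "L100-1",
}
print(build_prompt(record))
```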
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
seungalee1112/C_to_QA_T | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: refined_q
dtype: string
splits:
- name: train
num_bytes: 31255774
num_examples: 12503
download_size: 16313967
dataset_size: 31255774
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
antoniopagnotts/block-world-problem-v1-llama2-1k | ---
license: mit
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-46000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1002867
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RaulSalinasHerr/chilean_touristic_data | ---
license: apache-2.0
task_categories:
- time-series-forecasting
language:
- es
size_categories:
- 100K<n<1M
--- |
huggingartists/billy-talent | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/billy-talent"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.222716 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/66f0650a5d8acadaed4292d6e3df6b9b.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/billy-talent">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Billy Talent</div>
<a href="https://genius.com/artists/billy-talent">
<div style="text-align: center; font-size: 14px;">@billy-talent</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/billy-talent).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/billy-talent")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|122| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/billy-talent")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
n = len(datasets['train']['text'])
train, validation, test = np.split(
    datasets['train']['text'],
    [int(n * train_percentage), int(n * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
RoversX/Samantha-EN-CN-Converted-Dataset-V1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2905855
num_examples: 1000
download_size: 1705518
dataset_size: 2905855
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Samantha-EN-CN-Converted-Dataset-V1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KhalfounMehdi/Biorxiv_abstracts_large | ---
dataset_info:
features:
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 33615443
num_examples: 21078
download_size: 18750994
dataset_size: 33615443
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Biorxiv_abstracts_large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RGBD-SOD/rgbdsod_datasets | ---
dataset_info:
features:
- name: depth
dtype: image
- name: rgb
dtype: image
- name: gt
dtype: image
- name: name
dtype: string
config_name: v1
splits:
- name: train
num_bytes: 7378488019
num_examples: 8025
- name: validation
num_bytes: 4190272788
num_examples: 4600
download_size: 3506288426
dataset_size: 11568760807
---
# RGB-D Salient Object Detection Dataset (RGB-D SOD)
RGB-D Salient Object Detection (RGB-D SOD) aims to detect and segment objects that *visually attract the most human interest* from a pair of color and depth images.
## Train
- COME-8K [8025 samples]
## Dev
- COME-E [4600 samples]
## Test
- Coming soon
## How to use
~~~python
from datasets import load_dataset
dataset = load_dataset(
"RGBD-SOD/rgbdsod_datasets", "v1", split="train", cache_dir="data"
)
print(dataset[0])
~~~
## BibTeX entry and citation info
```bibtex
@inproceedings{zhang2021rgb,
title={RGB-D saliency detection via cascaded mutual information minimization},
author={Zhang, Jing and Fan, Deng-Ping and Dai, Yuchao and Yu, Xin and Zhong, Yiran and Barnes, Nick and Shao, Ling},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
pages={4338--4347},
year={2021}
}
```
|
Deojoandco/capstone_fromgpt_without_gold_v6 | ---
dataset_info:
features:
- name: dialog_id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_TAGS_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens_count
dtype: int64
- name: GPT_MI_FOUND
dtype: bool
- name: gpt_tags_token_count
dtype: int64
- name: gpt_tags
dtype: string
- name: tag_token_count_match
dtype: bool
splits:
- name: test
num_bytes: 20174
num_examples: 12
download_size: 21461
dataset_size: 20174
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "capstone_fromgpt_without_gold_v6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jambroz__sixtyoneeighty-4x7B-v1 | ---
pretty_name: Evaluation run of jambroz/sixtyoneeighty-4x7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jambroz/sixtyoneeighty-4x7B-v1](https://huggingface.co/jambroz/sixtyoneeighty-4x7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jambroz__sixtyoneeighty-4x7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T22:52:01.679205](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__sixtyoneeighty-4x7B-v1/blob/main/results_2024-04-05T22-52-01.679205.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6268881277247458,\n\
\ \"acc_stderr\": 0.03259725588386801,\n \"acc_norm\": 0.6303417998295023,\n\
\ \"acc_norm_stderr\": 0.03325791691681224,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5619650605574061,\n\
\ \"mc2_stderr\": 0.015325596612041551\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979279,\n\
\ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.01396014260059867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6587333200557658,\n\
\ \"acc_stderr\": 0.004731657228906993,\n \"acc_norm\": 0.8425612427803226,\n\
\ \"acc_norm_stderr\": 0.0036346959069096605\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406772,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406772\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295824,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295824\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889133,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5619650605574061,\n\
\ \"mc2_stderr\": 0.015325596612041551\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.01115114504221833\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \
\ \"acc_stderr\": 0.013727093010429788\n }\n}\n```"
repo_url: https://huggingface.co/jambroz/sixtyoneeighty-4x7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|arc:challenge|25_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|gsm8k|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hellaswag|10_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T22-52-01.679205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T22-52-01.679205.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- '**/details_harness|winogrande|5_2024-04-05T22-52-01.679205.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T22-52-01.679205.parquet'
- config_name: results
data_files:
- split: 2024_04_05T22_52_01.679205
path:
- results_2024-04-05T22-52-01.679205.parquet
- split: latest
path:
- results_2024-04-05T22-52-01.679205.parquet
---
# Dataset Card for Evaluation run of jambroz/sixtyoneeighty-4x7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jambroz/sixtyoneeighty-4x7B-v1](https://huggingface.co/jambroz/sixtyoneeighty-4x7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jambroz__sixtyoneeighty-4x7B-v1",
"harness_winogrande_5",
	split="latest")
```
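The timestamped split names sort lexicographically (the format is `YYYY_MM_DDTHH_MM_SS...`), so picking the most recent run by hand can be sketched with a small hypothetical helper (`latest_split` is not part of the `datasets` API):

```python
def latest_split(split_names):
    """Return the most recent timestamped split name.

    The split names used by these eval datasets are ISO-like timestamps
    (plus the alias "latest"), so lexicographic order is chronological
    order and max() is enough.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

print(latest_split(["2024_03_01T00_00_00.000000",
                    "2024_04_05T22_52_01.679205",
                    "latest"]))
# -> 2024_04_05T22_52_01.679205
```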
## Latest results
These are the [latest results from run 2024-04-05T22:52:01.679205](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__sixtyoneeighty-4x7B-v1/blob/main/results_2024-04-05T22-52-01.679205.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6268881277247458,
"acc_stderr": 0.03259725588386801,
"acc_norm": 0.6303417998295023,
"acc_norm_stderr": 0.03325791691681224,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5619650605574061,
"mc2_stderr": 0.015325596612041551
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979279,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.01396014260059867
},
"harness|hellaswag|10": {
"acc": 0.6587333200557658,
"acc_stderr": 0.004731657228906993,
"acc_norm": 0.8425612427803226,
"acc_norm_stderr": 0.0036346959069096605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406772,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406772
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295824,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295824
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990925,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990925
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889133,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5619650605574061,
"mc2_stderr": 0.015325596612041551
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.01115114504221833
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429788
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tomber0/mc-Nasos | ---
license: mit
---
A middling repository and, to be honest, a not-so-great model.
<div align="center">
<a href="https://www.youtube.com/@Nostoro">
<img src="https://huggingface.co/datasets/tomber0/mc-Nasos/resolve/main/underrailicon1.png" /><br>
</a>
</div> |
datahrvoje/twitter_dataset_1712991684 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20511
num_examples: 45
download_size: 12381
dataset_size: 20511
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
parsee-mizuhashi/sdmusic-test | ---
license: mit
---
|
BangumiBase/beasttamer | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Beast Tamer
This is the image base of bangumi Beast Tamer; we detected 25 characters and 1727 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 46 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 24 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 15 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 411 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 13 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 8 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 12 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 17 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 8 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 201 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 25 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 41 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 21 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 17 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 317 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 231 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 10 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 14 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 50 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 22 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 37 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 14 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 38 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 125 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
jhhon80/jhonathan | ---
license: openrail
---
|
bookbot/id_word2phoneme | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- id
- ms
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: ID Word2Phoneme
---
# Dataset Card for ID Word2Phoneme
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Github](https://github.com/open-dict-data/ipa-dict/blob/master/data/ma.txt)
- **Repository:** [Github](https://github.com/open-dict-data/ipa-dict/blob/master/data/ma.txt)
- **Point of Contact:**
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:**
### Dataset Summary
Originally a [Malay/Indonesian Lexicon](https://github.com/open-dict-data/ipa-dict/blob/master/data/ma.txt) retrieved from [ipa-dict](https://github.com/open-dict-data/ipa-dict). We removed the accented letters (because Indonesian graphemes do not use accents), separated homographs, and removed backslashes in phonemes -- resulting in a word-to-phoneme dataset.
### Languages
- Indonesian
- Malay
## Dataset Structure
### Data Instances
| word | phoneme |
| ----- | ------- |
| aba | aba |
| ab | ab |
| ab’ad | abʔad |
| abad | abad |
| abadi | abadi |
| ... | ... |
### Data Fields
- `word`: Word (grapheme) as a string.
- `phoneme`: Phoneme (IPA) as a string.
### Data Splits
| train |
| ----- |
| 27553 |
## Additional Information
### Citation Information
```
@misc{open-dict-data-no-date,
author = {{Open-Dict-Data}},
title = {{GitHub - open-dict-data/ipa-dict: Monolingual wordlists with pronunciation information in IPA}},
url = {https://github.com/open-dict-data/ipa-dict},
}
```
|
PsiPi/PascalQnA100 | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
tags:
- code
pretty_name: pascal100
size_categories:
- n<1K
---
100 Pascal Q&A pairs.
60% include an input string of some kind. |
BubbleJoe/sms_generated_mistral_v01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1041612
num_examples: 2034
download_size: 321498
dataset_size: 1041612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Eredim/autotrain-data-clasificacion_pisicinas | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: clasificacion_pisicinas
## Dataset Description
This dataset has been automatically processed by AutoTrain for project clasificacion_pisicinas.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<11x10 RGB PIL image>",
"target": 1
},
{
"image": "<12x15 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['psicina', 'psicinas', 'tierra'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 255 |
| valid | 108 |
|
Isaak-Carter/MAIN-function_calling_private | ---
dataset_info:
features:
- name: sample
dtype: string
splits:
- name: train
num_bytes: 279535171
num_examples: 101469
download_size: 107837732
dataset_size: 279535171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LawBERT-tw/law_news | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1487522
num_examples: 1838
download_size: 950859
dataset_size: 1487522
---
# Dataset Card for "law_news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aryananand19/construction | ---
license: mit
---
|
one-sec-cv12/chunk_113 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 28913041296.625
num_examples: 301027
download_size: 26649851879
dataset_size: 28913041296.625
---
# Dataset Card for "chunk_113"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chrystians/oasst1_pl_2 | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 67590476
num_examples: 81037
- name: validation
num_bytes: 2432688
num_examples: 3001
download_size: 20433061
dataset_size: 70023164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
musfiqdehan/preprocessed-BanglaNMT-sm | ---
license: cc-by-4.0
---
|
NLPC-UOM/Sentiment-tagger | ---
language:
- si
license:
- mit
---
*Sentiment Analysis of Sinhala News Comments*
The dataset used in this project was collected by crawling Sinhala online news sites, mainly www.lankadeepa.lk.
**Contact**
Please contact us if you need more information.
Surangika Ranathunga - surangika@cse.mrt.ac.lk
Isuru Liyanage - theisuru@gmail.com
https://github.com/theisuru/sentiment-tagger
**Citation**
If you use this data, please cite:
Ranathunga, S., & Liyanage, I. U. (2021). Sentiment Analysis of Sinhala News Comments. Transactions on Asian and Low-Resource Language Information Processing, 20(4), 1-23.
|
ShoukanLabs/AniSpeech | ---
language:
- en
license: mit
size_categories:
- n<1K
task_categories:
- text-to-speech
pretty_name: AniSpeech
tags:
- anime
- speech
- text-to-speech
- voice
dataset_info:
features:
- name: audio
dtype: audio
- name: caption
dtype: string
- name: phonetic captions
dtype: string
- name: voice
dtype: string
splits:
- name: ENGLISH
num_bytes: 18875728249.368
num_examples: 23656
download_size: 20449215803
dataset_size: 18875728249.368
configs:
- config_name: default
data_files:
- split: ENGLISH
path: data/ENGLISH-*
---
# AniSpeech Dataset
Welcome to the AniSpeech dataset, a continually expanding collection of captioned anime voices brought to you by ShoukanLabs.
- As we label more and more audio, new clips will automagically be uploaded here for use, separated by language
---
## ANNOUNCEMENTS:
- An upcoming update will add an immense amount of data to the dataset. However, because we cannot manually go through a dataset of that size, we have had to rely on automated quality estimation, so speaker splits may be inaccurate. This shouldn't impact finetuning multispeaker models, but when training single-speaker models you may have to listen to multiple speakers to find missing data. We plan to eventually overhaul this dataset completely.
## Key Features
- **LJSpeech Format Compatibility:** The captions in this dataset can be converted to comply with the LJSpeech format (recent changes sacrificed native LJSpeech support in favour of better captions); we plan to offer conversion scripts eventually.
- **Diverse Anime Voices:** Train your TTS models on high-quality vocal performances with variations in intonation, timbre, and pitch. The dataset offers a rich assortment of anime voices for creating generalised models.
- **Ideal for Generalized Models:** AniSpeech is a perfect choice for fine-tuning generalized models. With a diverse range of voices, it provides a solid foundation for training models that can handle a wide variety of speaking styles (all speakers are labeled with a separate speaker ID).
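As a rough illustration of the LJSpeech `metadata.csv` layout mentioned above (`id|transcription|normalized transcription`), a hypothetical conversion might look like this. It is not an official script, and the normalization step is omitted (the raw caption stands in for both transcription fields):

```python
def to_ljspeech_metadata(rows):
    # Each LJSpeech metadata line is: id|raw transcription|normalized transcription.
    return "\n".join(f"{file_id}|{caption}|{caption}" for file_id, caption in rows)

print(to_ljspeech_metadata([("clip_0001", "Hello there!")]))
# clip_0001|Hello there!|Hello there!
```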
## Limitations
- **Single-Voice Fine-Tuning:** While AniSpeech excels in training foundation models (due to its diversity), it's not recommended for fine-tuning on a single voice. Its strength lies in contributing to the development of versatile TTS models.
- **Dataset Curation:** Due to its size, manually curating the entire dataset can be impractical. If you encounter low-quality files or incorrect captions, we encourage you to contribute by creating a pull request to help maintain and improve the dataset.
## License
This dataset is released under the [MIT License](https://huggingface.co/datasets/ShoukanLabs/AniSpeech/raw/main/license).
Your contributions to the AniSpeech dataset are invaluable, and we appreciate your efforts in advancing the field of Text-to-Speech technology.
Happy coding and synthesizing!
|
NickKolok/regs-nextphoto | ---
license: gpl-3.0
---
|
epsilonai/Dexter_Grif | ---
task_categories:
- feature-extraction
language:
- en
tags:
- music
pretty_name: Dexter Grif
--- |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_3_t_0.5 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43662007
num_examples: 18928
- name: epoch_1
num_bytes: 44132843
num_examples: 18928
- name: epoch_2
num_bytes: 44207080
num_examples: 18928
- name: epoch_3
num_bytes: 44255371
num_examples: 18928
- name: epoch_4
num_bytes: 44273197
num_examples: 18928
- name: epoch_5
num_bytes: 44280253
num_examples: 18928
- name: epoch_6
num_bytes: 44277798
num_examples: 18928
- name: epoch_7
num_bytes: 44278037
num_examples: 18928
- name: epoch_8
num_bytes: 44273381
num_examples: 18928
- name: epoch_9
num_bytes: 44271632
num_examples: 18928
- name: epoch_10
num_bytes: 44270921
num_examples: 18928
- name: epoch_11
num_bytes: 44270373
num_examples: 18928
- name: epoch_12
num_bytes: 44268355
num_examples: 18928
- name: epoch_13
num_bytes: 44269373
num_examples: 18928
- name: epoch_14
num_bytes: 44269604
num_examples: 18928
- name: epoch_15
num_bytes: 44270869
num_examples: 18928
- name: epoch_16
num_bytes: 44271077
num_examples: 18928
- name: epoch_17
num_bytes: 44269424
num_examples: 18928
- name: epoch_18
num_bytes: 44271250
num_examples: 18928
- name: epoch_19
num_bytes: 44269801
num_examples: 18928
- name: epoch_20
num_bytes: 44270673
num_examples: 18928
- name: epoch_21
num_bytes: 44269581
num_examples: 18928
- name: epoch_22
num_bytes: 44271005
num_examples: 18928
- name: epoch_23
num_bytes: 44270821
num_examples: 18928
- name: epoch_24
num_bytes: 44270189
num_examples: 18928
- name: epoch_25
num_bytes: 44269261
num_examples: 18928
- name: epoch_26
num_bytes: 44270226
num_examples: 18928
- name: epoch_27
num_bytes: 44271066
num_examples: 18928
- name: epoch_28
num_bytes: 44271734
num_examples: 18928
- name: epoch_29
num_bytes: 44271600
num_examples: 18928
download_size: 685464577
dataset_size: 1327318802
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
joey234/mmlu-business_ethics-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 11347
num_examples: 5
- name: test
num_bytes: 1323050
num_examples: 100
download_size: 131380
dataset_size: 1334397
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-business_ethics-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tejasvaidhya/testing | ---
dataset_info:
features:
- name: image
dtype: image
- name: ocr_annotation_texts
dtype: string
- name: image_height
dtype: int64
- name: image_width
dtype: int64
configs:
- config_name: default
data_files:
- split: test
path: testing.parquet
---
|
james-burton/OrientalMuseum_min3-3DwhiteTVT-name | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': Aegis
'1': Ajaeng Holder
'2': Album Painting
'3': Amulet Mould
'4': Animal Figurine
'5': Animal Mummy
'6': Animal bone
'7': Arm Guard
'8': Axe Head
'9': Axle-caps
'10': Ball
'11': Ballista Bolt
'12': Band
'13': Basin
'14': Baton
'15': Bead Net
'16': Belt Hook
'17': Betel Nut Cutter
'18': Blouse
'19': Blu-ray disc
'20': Bolt
'21': Book Cover
'22': Box
'23': Brush Pot
'24': Brush Rest
'25': Brush Tray
'26': Bulb Bowl
'27': Bullet Mould
'28': Burnisher
'29': Cabinet
'30': Cannon
'31': Cap
'32': Carved stone
'33': Case
'34': Cash Box
'35': Chest
'36': Cigar Holder
'37': Clapper
'38': Clay pipe (smoking)
'39': Comb
'40': Compass
'41': Cosmetic and Medical Equipment and Implements
'42': Counterpoise
'43': Cricket pot
'44': Cross-bow Lock
'45': Cup And Saucer
'46': Cup, Saucer
'47': Cushion Cover
'48': DVDs
'49': Dagger
'50': Dice Box
'51': Dice Shaker
'52': Disc
'53': Domestic Equipment and Utensils
'54': Double Dagger
'55': Dummy
'56': Ear Protector
'57': Ear Stud
'58': Earring
'59': Elephant Goad
'60': Erotic Figurine
'61': Eye Protector
'62': Fan Case
'63': Feet Protector
'64': Ferrous object
'65': Figurine Mould
'66': File
'67': Finger Ring
'68': Fitting
'69': Flannel
'70': Flute
'71': Funerary Cone
'72': Funerary goods
'73': Funerary money
'74': Furosode
'75': Greek crosses
'76': Hand Jade
'77': Hand Protector
'78': Handwarmer
'79': Hanging
'80': Headband
'81': Heart Scarab
'82': Human Figurine
'83': Incense Holder
'84': Inkstick
'85': Jue (jade)
'86': Kite
'87': Knee Protector
'88': Kohl Pot
'89': Kundika
'90': Leaflet
'91': Leg
'92': Leg Protector
'93': Letter
'94': Lock
'95': Mah Jong Rack
'96': Majiang set
'97': Manuscript Page
'98': Massager
'99': Mat
'100': Mica Painting
'101': Miniature Painting
'102': Miniature Portrait
'103': Mortar
'104': Mould
'105': Mouth Jade
'106': Mouth Protector
'107': Mouth-piece
'108': Mummy Label
'109': Nail Protector
'110': Neck Guard
'111': Nose Protector
'112': Opium Pipe
'113': Opium Weight
'114': Oracle Bone
'115': Ostraka
'116': Paddle
'117': Palette
'118': Panel
'119': Part
'120': Pelmet
'121': Pencase
'122': Pendant
'123': Perfumer
'124': Phallus Protector
'125': Phylactery
'126': Pigstick
'127': Pipe
'128': Pipe Case
'129': Pipe Holder
'130': Pith Painting
'131': Plaque
'132': Plate
'133': Poh Kam
'134': Pounder
'135': Prayer Wheel
'136': Quoit
'137': Rank Square
'138': Rubber
'139': Sake Cup
'140': Scabbard Chape
'141': Scabbard Slide
'142': Scarab Seal
'143': Scarf
'144': Score Board
'145': Screen
'146': Seal
'147': Seal Paste Pot
'148': Shaft Terminal
'149': Shield
'150': Shroud Weight
'151': Sleeve Band
'152': Sleeve Weight
'153': Slide
'154': Soles
'155': Spillikins
'156': Staff Head
'157': Stamp
'158': Stand
'159': Stand of Incense Burner
'160': Stem Bowl
'161': Stem Cup
'162': Story Cloth
'163': Strainer
'164': Sword Guard
'165': Sword Knob
'166': T-shirts
'167': Table
'168': Table Runner
'169': Thangka
'170': Throwing Stick
'171': Tomb Figure
'172': Tomb Model
'173': Tongue Protector
'174': Washer
'175': Water Dropper
'176': Water Pot
'177': Wine Pot
'178': Womb Protector
'179': Woodblock Print
'180': Writing Desk
'181': accessories
'182': adzes
'183': alabastra
'184': albums
'185': altar components
'186': altars
'187': amphorae
'188': amulets
'189': anchors
'190': animation cels
'191': animation drawings
'192': anklets
'193': armbands
'194': armor
'195': armrests
'196': arrowheads
'197': arrows
'198': autograph albums
'199': axes
'200': 'axes: woodworking tools'
'201': back scratchers
'202': badges
'203': bags
'204': balances
'205': bandages
'206': bangles
'207': banners
'208': baskets
'209': beads
'210': beakers
'211': bedspreads
'212': bells
'213': belts
'214': bezels
'215': bi
'216': blades
'217': blowguns
'218': board games
'219': boats
'220': boilers
'221': bone
'222': booklets
'223': books
'224': bottles
'225': bowls
'226': boxes
'227': bracelets
'228': bread
'229': brick
'230': brooches
'231': brush washers
'232': brushes
'233': buckets
'234': buckles
'235': business cards
'236': buttons
'237': caddies
'238': calendars
'239': calligraphy
'240': candelabras
'241': candleholders
'242': candlesticks
'243': canopic jars
'244': card cases
'245': card tables
'246': cards
'247': carvings
'248': cases
'249': cash
'250': celestial globes
'251': censers
'252': chains
'253': chairs
'254': charms
'255': charts
'256': chess sets
'257': chessmen
'258': chisels
'259': chokers
'260': chopsticks
'261': cigarette cases
'262': cigarette holders
'263': cippi
'264': clamps
'265': clappers
'266': claypipe
'267': cloth
'268': clothing
'269': coats
'270': coffins
'271': coins
'272': collar
'273': combs
'274': compact discs
'275': containers
'276': coverings
'277': covers
'278': crucifixes
'279': cuffs
'280': cups
'281': cushions
'282': cutlery
'283': cylinder seals
'284': deels
'285': deity figurine
'286': diagrams
'287': dice
'288': dishes
'289': document containers
'290': documents
'291': dolls
'292': doors
'293': drawings
'294': dresses
'295': dressing gowns
'296': drums
'297': dung-chen
'298': earrings
'299': embroidery
'300': ensembles
'301': envelopes
'302': 'equipment for personal use: grooming, hygiene and health care'
'303': ewers
'304': fans
'305': fasteners
'306': 'feet: furniture components'
'307': female figurine
'308': ferrules
'309': fiddles
'310': figures
'311': figurines
'312': finials
'313': fishhooks
'314': flagons
'315': flags
'316': flasks
'317': flint
'318': fragments
'319': funnels
'320': furniture components
'321': gameboards
'322': games
'323': gaming counters
'324': ge
'325': glassware
'326': gloves
'327': goblets
'328': gongs
'329': gowns
'330': greeting cards
'331': hair ornaments
'332': hairpins
'333': hammerstones
'334': handkerchiefs
'335': handles
'336': handscrolls
'337': hanging scrolls
'338': harnesses
'339': hatpins
'340': hats
'341': headdresses
'342': headrests
'343': heads
'344': headscarves
'345': helmets
'346': hobs
'347': hoods
'348': hooks
'349': houses
'350': identity cards
'351': illuminated manuscripts
'352': incense burners
'353': incense sticks
'354': ink bottles
'355': inkstands
'356': inkstones
'357': inkwells
'358': inlays
'359': iron
'360': jackets
'361': jar seal
'362': jars
'363': jewelry
'364': jue
'365': juglets
'366': jugs
'367': kayagum
'368': keys
'369': kimonos
'370': knives
'371': kŏmun'gos
'372': ladles
'373': lamps
'374': lanterns
'375': lanyards
'376': leatherwork
'377': lids
'378': lockets
'379': loom weights
'380': maces
'381': manuscripts
'382': maps
'383': maquettes
'384': masks
'385': medals
'386': miniatures
'387': mirrors
'388': miscellaneous
'389': models
'390': money
'391': mortarboards
'392': mounts
'393': mugs
'394': mummies
'395': musical instruments
'396': nails
'397': necklaces
'398': needles
'399': netsukes
'400': nozzles
'401': obelisks
'402': obis
'403': oboes
'404': oil lamps
'405': ornaments
'406': overdresses
'407': pages
'408': paintings
'409': paper money
'410': paperweights
'411': papyrus
'412': passports
'413': pectorals
'414': pendants
'415': pennants
'416': pestles
'417': petticoats
'418': photograph albums
'419': photographs
'420': pictures
'421': pins
'422': pipes
'423': pitchers
'424': plaques
'425': plaster
'426': playing card boxes
'427': playing cards
'428': plinths
'429': plumb bobs
'430': plumbing fixtures
'431': plume holders
'432': poker
'433': pommels
'434': postage stamps
'435': postcards
'436': posters
'437': pots
'438': pottery
'439': prayer beads
'440': prayers
'441': printing blocks
'442': printing plates
'443': prints
'444': punch bowls
'445': puppets
'446': purses
'447': puzzles
'448': pyxides
'449': quilts
'450': rag-dung
'451': razors
'452': reliefs
'453': rifles
'454': rings
'455': robes
'456': roofing tile
'457': rosaries
'458': rose bowls
'459': rubbings
'460': rugs
'461': rulers
'462': sandals
'463': saris
'464': sarongs
'465': sashes
'466': sauceboats
'467': saucers
'468': saws
'469': scabbards
'470': scaraboids
'471': scarabs
'472': scarves
'473': scepters
'474': scissors
'475': scrolls
'476': sculpture
'477': seed
'478': seppa
'479': shadow puppets
'480': shawls
'481': shears
'482': shell
'483': shelves
'484': sherds
'485': shields
'486': shoes
'487': shrines
'488': sistra
'489': situlae
'490': sketches
'491': skewers
'492': skirts
'493': snuff bottles
'494': socks
'495': spatulas
'496': spearheads
'497': spears
'498': spittoons
'499': spoons
'500': stampers
'501': staples
'502': statues
'503': statuettes
'504': steelyards
'505': stelae
'506': sticks
'507': stirrup jars
'508': stools
'509': stoppers
'510': straps
'511': studs
'512': styluses
'513': sugar bowls
'514': sugar tongs
'515': swagger sticks
'516': swords
'517': tablecloths
'518': tablets
'519': tacks
'520': talismans
'521': tallies
'522': tangrams
'523': tankards
'524': tea bowls
'525': tea caddies
'526': tea kettles
'527': teacups
'528': teapots
'529': telephones
'530': ties
'531': tiles
'532': toggles
'533': toilet caskets
'534': tools
'535': toys
'536': trays
'537': trimming
'538': trophies
'539': trousers
'540': trumpets
'541': tubes
'542': tureens
'543': tweezers
'544': typewriters
'545': underdresses
'546': underwear
'547': unidentified
'548': urinals
'549': ushabti
'550': utensils
'551': vases
'552': veils
'553': vessels
'554': votive offerings
'555': waistcoats
'556': wall tile
'557': watches
'558': weighing devices
'559': weight
'560': weights
'561': whetstones
'562': whistles
'563': whorls
'564': wire
'565': wood blocks
'566': writing boards
'567': xylophones
- name: other_name
dtype: string
- name: material
dtype: string
- name: production.period
dtype: string
- name: production.place
dtype: string
- name: new_root
dtype: string
splits:
- name: validation
num_bytes: 173148663.257
num_examples: 5489
- name: test
num_bytes: 161503107.568
num_examples: 5489
- name: train
num_bytes: 3091343456.875
num_examples: 116625
download_size: 3366881100
dataset_size: 3425995227.7
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
nuprl/pass_k_with_MultiPL-E | ---
dataset_info:
features:
- name: Experiment
dtype: string
- name: K
dtype: int64
- name: PassRate
dtype: float64
splits:
- name: train
num_bytes: 64770
num_examples: 690
download_size: 8011
dataset_size: 64770
---
# Dataset Card for "pass_k_with_MultiPL-E"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_180 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 973052860.0
num_examples: 189605
download_size: 997573493
dataset_size: 973052860.0
---
# Dataset Card for "chunk_180"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stoddur/referral_commands_1B1 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1544000
num_examples: 1000
- name: eval
num_bytes: 1544000
num_examples: 1000
download_size: 189073
dataset_size: 3088000
---
# Dataset Card for "referral_commands_1B1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mrturan/Youtube | ---
license: other
---
|
Naomibas/llm-system-prompts-benchmark | ---
license: apache-2.0
language:
- en
pretty_name: 100 system prompts for benchmarking large language models
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset is a collection of 100 system prompts for large language models.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
These 100 system prompts test a model's ability to follow grammatical patterns; answer basic multiple choice questions; act according to a particular persona; memorize information; and speak in French.
Files:
- **hundred_system_prompts.py**: refer to this to see the (prompt, probe, function) triplets, as well as the helper functions.
- **hundred_system_prompts.json**: this is purely for display purposes.
- **run_benchmark.py**: this runs the 100 tests on a model, without any context other than the system prompt and the probe.
- **create_json_file.py**: a small file that was used to create the **hundred_system_prompts.py** file.
More info:
- **Curated by:** Naomi Bashkansky
- **Language(s) (NLP):** en
- **License:** apache-2.0
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/likenneth/persona
- **Paper:** Forthcoming.
## Uses
A benchmark for large language models: how good are LLMs at following a system prompt? Tests both basic capabilities (is a model able to follow the system prompt) and basic alignment (does a model that *can* follow the system prompt do so).
Can be used to compare different models, or to help in performing interventions on a model to make it better at following system prompts.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is released open source. Researchers are especially encouraged to use this dataset.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
"prompt" is given as a system prompt to a large language model. "probe" is given as a user inquiry; its purpose is to elicit a response that allows us to check if the LLM is following the system prompt. "function" checks whether the LLM's response to the probe follows the system prompt; it returns a number from 0 (not following) to 1 (following).
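A toy (prompt, probe, function) triplet in this shape might look like the following. This is a hypothetical illustration, not an entry from the actual benchmark files:

```python
# System prompt and user probe for one toy test case.
prompt = "Always answer in exactly three words."
probe = "What is the capital of France?"

def score(response: str) -> float:
    # Return 1.0 if the reply follows the system prompt, else 0.0.
    return 1.0 if len(response.split()) == 3 else 0.0

print(score("It is Paris."))  # 1.0
print(score("Paris."))        # 0.0
```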
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
There exists no benchmark of system prompts.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
Process: thinking of system prompts, probes, and testing functions; running the system prompts on GPT-4 to check that GPT-4 is (mostly) able to follow them. Testing functions are written in Python.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
Naomi Bashkansky made most of the system prompts, and Kenneth Li made the rest.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
No.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Limitation: as models become more capable, this benchmark may become outdated/too easy. The ideal benchmark is one that tests the model's alignment - its propensity toward following the system prompt - rather than its ability to do so.
Bias: this dataset is only in English, with the exception of three French prompts.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
Forthcoming.
**APA:**
Forthcoming.
## Dataset Card Authors
Naomi Bashkansky, Kenneth Li
## Dataset Card Contact
naomibashkansky@college.harvard.edu, ke_li@g.harvard.edu |
vsanse/hf-codegen | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 32905015
num_examples: 5050
download_size: 18940909
dataset_size: 32905015
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
harinarayan/my_newest_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1558328.0
num_examples: 36
download_size: 1436147
dataset_size: 1558328.0
---
# Dataset Card for "my_newest_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tianduo/gsm8k-split | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: ans
dtype: float64
splits:
- name: train
num_bytes: 3607636
num_examples: 6705
- name: dev
num_bytes: 415350
num_examples: 768
- name: test
num_bytes: 724284
num_examples: 1319
download_size: 2749891
dataset_size: 4747270
---
# Dataset Card for "gsm8k-split"
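The schema above stores a numeric `ans` (float64) alongside the textual `answer`. Assuming this split preserves the upstream GSM8K convention of ending each solution with a line like `#### 72`, the numeric field can be recovered with a small parser (a sketch; the function name and example string are invented for illustration):

```python
def extract_final_answer(answer: str) -> float:
    """Parse the numeric answer from a GSM8K-style solution string,
    which conventionally ends with a line like '#### 72'.
    (Assumption: this split keeps the upstream GSM8K format.)"""
    # Take the text after the last '####' marker and strip the
    # commas that sometimes appear in large numbers.
    tail = answer.rsplit("####", 1)[-1]
    return float(tail.strip().replace(",", ""))

example = "Natalia sold 48 + 24 = 72 clips altogether.\n#### 72"
assert extract_final_answer(example) == 72.0
```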
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_89_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4516712
num_examples: 10368
download_size: 1529108
dataset_size: 4516712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_89_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_6 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 7073760
num_examples: 7809
download_size: 1306657
dataset_size: 7073760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
puar-playground/LACE | ---
license: mit
---
|
NovusResearch/OpenHermes-2.5-Translated-TR-sharegpt-style | ---
dataset_info:
features:
- name: custom_instruction
dtype: 'null'
- name: language
dtype: 'null'
- name: idx
dtype: 'null'
- name: source
dtype: string
- name: model_name
dtype: 'null'
- name: skip_prompt_formatting
dtype: bool
- name: category
dtype: string
- name: views
dtype: 'null'
- name: title
dtype: 'null'
- name: topic
dtype: 'null'
- name: id
dtype: 'null'
- name: hash
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: model
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 8364611
num_examples: 5000
download_size: 4674084
dataset_size: 8364611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DaviAlmeidaDS/pedidos_medicamentos | ---
license: apache-2.0
---
|
dhiruHF/small-occupation-classifier | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38256
num_examples: 300
download_size: 10732
dataset_size: 38256
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "small-occupation-classifier"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cail2018 | ---
annotations_creators:
- found
language_creators:
- found
language:
- zh
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- other
task_ids: []
paperswithcode_id: chinese-ai-and-law-cail-2018
pretty_name: CAIL 2018
tags:
- judgement-prediction
dataset_info:
features:
- name: fact
dtype: string
- name: relevant_articles
sequence: int32
- name: accusation
sequence: string
- name: punish_of_money
dtype: float32
- name: criminals
sequence: string
- name: death_penalty
dtype: bool
- name: imprisonment
dtype: float32
- name: life_imprisonment
dtype: bool
splits:
- name: exercise_contest_train
num_bytes: 220112348
num_examples: 154592
- name: exercise_contest_valid
num_bytes: 21702109
num_examples: 17131
- name: exercise_contest_test
num_bytes: 41057538
num_examples: 32508
- name: first_stage_train
num_bytes: 1779653382
num_examples: 1710856
- name: first_stage_test
num_bytes: 244334666
num_examples: 217016
- name: final_test
num_bytes: 44194611
num_examples: 35922
download_size: 1167828091
dataset_size: 2351054654
configs:
- config_name: default
data_files:
- split: exercise_contest_train
path: data/exercise_contest_train-*
- split: exercise_contest_valid
path: data/exercise_contest_valid-*
- split: exercise_contest_test
path: data/exercise_contest_test-*
- split: first_stage_train
path: data/first_stage_train-*
- split: first_stage_test
path: data/first_stage_test-*
- split: final_test
path: data/final_test-*
---
# Dataset Card for CAIL 2018
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/thunlp/CAIL/blob/master/README_en.md)
- **Repository:** [Github](https://github.com/thunlp/CAIL)
- **Paper:** [Arxiv](https://arxiv.org/abs/1807.02478)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
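Pending a full field description, the feature schema in the YAML header above (`fact`, `relevant_articles`, `accusation`, `punish_of_money`, `criminals`, `death_penalty`, `imprisonment`, `life_imprisonment`) already fixes the record shape. A minimal sketch of working with such records; the example record below is invented purely for illustration and is not drawn from the dataset:

```python
# Invented example record, shaped like the features listed above;
# field values are illustrative only.
record = {
    "fact": "...",                 # case description (Chinese text in the real data)
    "relevant_articles": [234],    # statute article numbers
    "accusation": ["故意伤害"],     # charge labels
    "punish_of_money": 0.0,
    "criminals": ["张某"],
    "death_penalty": False,
    "imprisonment": 12.0,          # term length (units per the CAIL 2018 task definition)
    "life_imprisonment": False,
}

def custodial_sentence(rec: dict) -> bool:
    """True if the judgment involves any custodial term."""
    return (rec["life_imprisonment"]
            or rec["death_penalty"]
            or rec["imprisonment"] > 0)

assert custodial_sentence(record)
```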
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@JetRunner](https://github.com/JetRunner) for adding this dataset. |
Intuit-GenSRF/tweets-hate-speech-detection-es | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
- name: processed_text
sequence: string
- name: text_es
dtype: string
splits:
- name: train
num_bytes: 8933354
num_examples: 31962
download_size: 6104746
dataset_size: 8933354
---
# Dataset Card for "tweets_hate_speech_detection-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca | ---
pretty_name: Evaluation run of tlphams/zoyllm-7b-slimorca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tlphams/zoyllm-7b-slimorca](https://huggingface.co/tlphams/zoyllm-7b-slimorca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T20:19:06.813924](https://huggingface.co/datasets/open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca/blob/main/results_2023-12-04T20-19-06.813924.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4870010988384523,\n\
\ \"acc_stderr\": 0.03455949361884823,\n \"acc_norm\": 0.4920391497879656,\n\
\ \"acc_norm_stderr\": 0.03531050289249056,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.4913166366572656,\n\
\ \"mc2_stderr\": 0.0160517163595852\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4726962457337884,\n \"acc_stderr\": 0.014589589101985993,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5509858593905597,\n\
\ \"acc_stderr\": 0.004963771168672079,\n \"acc_norm\": 0.7211710814578769,\n\
\ \"acc_norm_stderr\": 0.004475067344626756\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.03308530426228257,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.03308530426228257\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6414141414141414,\n \"acc_stderr\": 0.034169036403915214,\n \"\
acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.034169036403915214\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361356,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361356\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.025049197876042338,\n\
\ \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.025049197876042338\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\
\ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\"\
: 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n\
\ \"acc_stderr\": 0.03266478331527272,\n \"acc_norm\": 0.35648148148148145,\n\
\ \"acc_norm_stderr\": 0.03266478331527272\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n\
\ \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990407,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990407\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809447,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.648786717752235,\n\
\ \"acc_stderr\": 0.017069982051499434,\n \"acc_norm\": 0.648786717752235,\n\
\ \"acc_norm_stderr\": 0.017069982051499434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.02684298551961537,\n\
\ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.02684298551961537\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095266,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095266\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n\
\ \"acc_stderr\": 0.028333277109562793,\n \"acc_norm\": 0.5337620578778135,\n\
\ \"acc_norm_stderr\": 0.028333277109562793\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347666,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347666\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35984354628422427,\n\
\ \"acc_stderr\": 0.0122582604836898,\n \"acc_norm\": 0.35984354628422427,\n\
\ \"acc_norm_stderr\": 0.0122582604836898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016633,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45588235294117646,\n \"acc_stderr\": 0.020148939420415738,\n \
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.020148939420415738\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.036871306155620606,\n\
\ \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.036871306155620606\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.016371836286454604,\n \"mc2\": 0.4913166366572656,\n\
\ \"mc2_stderr\": 0.0160517163595852\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6732438831886346,\n \"acc_stderr\": 0.013181997302131362\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20697498104624715,\n \
\ \"acc_stderr\": 0.011159498164891766\n }\n}\n```"
repo_url: https://huggingface.co/tlphams/zoyllm-7b-slimorca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|arc:challenge|25_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|gsm8k|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hellaswag|10_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T20-19-06.813924.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- '**/details_harness|winogrande|5_2023-12-04T20-19-06.813924.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T20-19-06.813924.parquet'
- config_name: results
data_files:
- split: 2023_12_04T20_19_06.813924
path:
- results_2023-12-04T20-19-06.813924.parquet
- split: latest
path:
- results_2023-12-04T20-19-06.813924.parquet
---
# Dataset Card for Evaluation run of tlphams/zoyllm-7b-slimorca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tlphams/zoyllm-7b-slimorca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [tlphams/zoyllm-7b-slimorca](https://huggingface.co/tlphams/zoyllm-7b-slimorca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca",
"harness_winogrande_5",
split="train")
```
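The per-task configuration names listed above follow a predictable scheme: the harness task name with `-` and `:` replaced by `_`, prefixed with `harness_` and suffixed with the few-shot count. A small helper (hypothetical, for illustration only) can build the config name to pass to `load_dataset`:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build the config name used in this dataset for a harness task.

    For example, 'hendrycksTest-anatomy' evaluated with 5 few-shot
    examples maps to the config 'harness_hendrycksTest_anatomy_5'.
    """
    # Replace the separators used in harness task names with underscores.
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"

# Examples matching configs defined in this card:
print(harness_config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(harness_config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
```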
## Latest results
These are the [latest results from run 2023-12-04T20:19:06.813924](https://huggingface.co/datasets/open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca/blob/main/results_2023-12-04T20-19-06.813924.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4870010988384523,
"acc_stderr": 0.03455949361884823,
"acc_norm": 0.4920391497879656,
"acc_norm_stderr": 0.03531050289249056,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.4913166366572656,
"mc2_stderr": 0.0160517163595852
},
"harness|arc:challenge|25": {
"acc": 0.4726962457337884,
"acc_stderr": 0.014589589101985993,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.5509858593905597,
"acc_stderr": 0.004963771168672079,
"acc_norm": 0.7211710814578769,
"acc_norm_stderr": 0.004475067344626756
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.03308530426228257,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.03308530426228257
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.034169036403915214,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.034169036403915214
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361356,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.025049197876042338,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.025049197876042338
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507382,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507382
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.03266478331527272,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.03266478331527272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334384,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334384
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809447,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.648786717752235,
"acc_stderr": 0.017069982051499434,
"acc_norm": 0.648786717752235,
"acc_norm_stderr": 0.017069982051499434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.02684298551961537,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.02684298551961537
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095266,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095266
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5337620578778135,
"acc_stderr": 0.028333277109562793,
"acc_norm": 0.5337620578778135,
"acc_norm_stderr": 0.028333277109562793
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35984354628422427,
"acc_stderr": 0.0122582604836898,
"acc_norm": 0.35984354628422427,
"acc_norm_stderr": 0.0122582604836898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016633,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.020148939420415738,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.020148939420415738
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.036871306155620606,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.036871306155620606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.016371836286454604,
"mc2": 0.4913166366572656,
"mc2_stderr": 0.0160517163595852
},
"harness|winogrande|5": {
"acc": 0.6732438831886346,
"acc_stderr": 0.013181997302131362
},
"harness|gsm8k|5": {
"acc": 0.20697498104624715,
"acc_stderr": 0.011159498164891766
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
irds/kilt_codec | ---
pretty_name: '`kilt/codec`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `kilt/codec`
The `kilt/codec` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/kilt#kilt/codec).
# Data
This dataset provides:
- `queries` (i.e., topics); count=42
- `qrels`: (relevance assessments); count=11,323
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/kilt_codec', 'queries')
for record in queries:
record # {'query_id': ..., 'query': ..., 'domain': ..., 'guidelines': ...}
qrels = load_dataset('irds/kilt_codec', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{mackie2022codec,
title={CODEC: Complex Document and Entity Collection},
author={Mackie, Iain and Owoicho, Paul and Gemmell, Carlos and Fischer, Sophie and MacAvaney, Sean and Dalton, Jeffery},
booktitle={Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2022}
}
```
|
anuragk16/gretelai-synthetic_text_to_sql-llama2-first-10k | ---
dataset_info:
features:
- name: id
dtype: int32
- name: domain
dtype: string
- name: domain_description
dtype: string
- name: sql_complexity
dtype: string
- name: sql_complexity_description
dtype: string
- name: sql_task_type
dtype: string
- name: sql_task_type_description
dtype: string
- name: sql_prompt
dtype: string
- name: sql_context
dtype: string
- name: sql
dtype: string
- name: sql_explanation
dtype: string
- name: instruction_and_prompt
dtype: string
splits:
- name: train
num_bytes: 15645563
num_examples: 9999
download_size: 5657723
dataset_size: 15645563
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Back-up/qa-temp-v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: response
struct:
- name: response
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: instruction
dtype: string
- name: prompt_name
dtype: string
splits:
- name: train
num_bytes: 37455
num_examples: 11
download_size: 42609
dataset_size: 37455
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qa-temp-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/clueweb12 | ---
pretty_name: '`clueweb12`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `clueweb12`
The `clueweb12` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clueweb12#clueweb12).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=733,019,372
This dataset is used by: [`clueweb12_touche-2020-task-2`](https://huggingface.co/datasets/irds/clueweb12_touche-2020-task-2), [`clueweb12_touche-2021-task-2`](https://huggingface.co/datasets/irds/clueweb12_touche-2021-task-2)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/clueweb12', 'docs')
for record in docs:
record # {'doc_id': ..., 'url': ..., 'date': ..., 'http_headers': ..., 'body': ..., 'body_content_type': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
tasksource/blimp_classification | ---
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- acceptability-classification
language:
- en
tags:
- cola
---
BLiMP with the coarse categories, recast as a classification task (CoLA format). |
fengtc/GuanacoDataset | ---
license: openrail
---
|
Snoopy04/arc-sv-500 | ---
dataset_info:
features:
- name: question
dtype: string
- name: id
dtype: string
- name: answer
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
splits:
- name: train
num_bytes: 227985.63557858375
num_examples: 658
- name: test
num_bytes: 173241.36442141625
num_examples: 500
download_size: 212445
dataset_size: 401227.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/grenville_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of grenville/グレンヴィル/格伦维尔 (Azur Lane)
This is the dataset of grenville/グレンヴィル/格伦维尔 (Azur Lane), containing 98 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, red_eyes, purple_hair, multicolored_hair, one_side_up, hair_ornament, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 98 | 168.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grenville_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 98 | 90.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grenville_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 256 | 204.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grenville_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 98 | 148.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grenville_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 256 | 305.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grenville_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/grenville_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, cleavage, blush, looking_at_viewer, smile, fingerless_gloves, thighhighs, open_mouth, bare_shoulders |
| 1 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, mosaic_censoring, nipples, open_mouth, penis, fingerless_gloves, solo_focus, spread_legs, thighhighs, breast_grab, pink_hair, pussy, sex, side_ponytail, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | blush | looking_at_viewer | smile | fingerless_gloves | thighhighs | open_mouth | bare_shoulders | 1boy | hetero | mosaic_censoring | nipples | penis | solo_focus | spread_legs | breast_grab | pink_hair | pussy | sex | side_ponytail | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------|:--------------------|:--------|:--------------------|:-------------|:-------------|:-----------------|:-------|:---------|:-------------------|:----------|:--------|:-------------|:--------------|:--------------|:------------|:--------|:------|:----------------|:----------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
anthony-wss/librispeech_asr-audiodec_44k | ---
configs:
- config_name: default
data_files:
- split: train.clean.360
path: data/train.clean.360-*
- split: train.other.500
path: data/train.other.500-*
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: train.clean.360
num_bytes: 10788010668
num_examples: 104014
- name: train.other.500
num_bytes: 14756337873
num_examples: 148688
download_size: 3925792960
dataset_size: 25544348541
---
# Dataset Card for "librispeech_asr-audiodec_44k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/arxiv-clustering-p2p | ---
language:
- en
--- |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-36000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 931954
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlekseyKorshuk/CS1QACensoredClassEval-ultrachat-phi-2-dpo-chatml | ---
dataset_info:
features:
- name: model_input
list:
- name: content
dtype: string
- name: role
dtype: string
- name: baseline_response
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 299296
num_examples: 100
download_size: 110074
dataset_size: 299296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Felladrin/ChatML-distilabel-capybara-dpo-7k-binarized | ---
license: apache-2.0
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- question-answering
- text-generation
---
[argilla/distilabel-capybara-dpo-7k-binarized](https://huggingface.co/datasets/argilla/distilabel-capybara-dpo-7k-binarized) in ChatML format, ready to use in [HuggingFace TRL's DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")
dataset = load_dataset("argilla/distilabel-capybara-dpo-7k-binarized", split="train")
def format(columns):
return {
"prompt": tokenizer.apply_chat_template(columns["chosen"][:-1], tokenize=False, add_generation_prompt=True),
"chosen": f"{columns['chosen'][-1]['content']}<|im_end|>",
"rejected": f"{columns['rejected'][-1]['content']}<|im_end|>",
}
dataset.map(format).select_columns(['prompt', 'chosen', 'rejected', 'source', 'rating_chosen', 'rating_rejected', 'chosen_model', 'rejected_model']).to_parquet("train.parquet")
```
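For reference, the ChatML layout this conversion targets can be sketched by hand. This is an illustration only: the actual conversion above relies on `tokenizer.apply_chat_template`, and the `<|im_start|>`/`<|im_end|>` markers below are the standard ChatML tokens, not something specific to this dataset.

```python
# Hand-rolled sketch of the ChatML layout (illustrative only; the real
# conversion uses tokenizer.apply_chat_template with the model's template).
def to_chatml(messages, add_generation_prompt=False):
    text = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    if add_generation_prompt:
        # Mirrors add_generation_prompt=True in apply_chat_template.
        text += "<|im_start|>assistant\n"
    return text

prompt = to_chatml([{"role": "user", "content": "Hello!"}], add_generation_prompt=True)
print(prompt)
```

The `chosen` and `rejected` columns then only need the assistant's final turn appended, terminated by `<|im_end|>`, as done in the `format` function above.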
|
florianbussmann/FUNSD-vu2020revising | ---
language:
- en
multilinguality:
- monolingual
language_bcp47:
- en-US
---
# Dataset Card for FUNSD-vu2020revising
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:** [https://arxiv.org/abs/2010.05322](https://arxiv.org/abs/2010.05322)
### Dataset Summary
This is the revised version of the [FUNSD dataset](https://huggingface.co/datasets/nielsr/funsd) as proposed by [Vu, H. M., & Nguyen, D. T. N. (2020)](https://arxiv.org/abs/2010.05322).
### Supported Tasks and Leaderboards
The Form Understanding challenge comprises three tasks, namely word grouping, semantic-entity labeling, and entity linking.
## Dataset Structure
### Data Instances
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
- `id`: a `string` feature - GUID.
- `words`: a `list` of `string` features.
- `bboxes`: a `list` of `list` with four (`int`) features.
- `ner_tags`: a `list` of classification labels (`int`). Full tagset with indices:
```python
{'O': 0, 'B-HEADER': 1, 'I-HEADER': 2, 'B-QUESTION': 3, 'I-QUESTION': 4, 'B-ANSWER': 5, 'I-ANSWER': 6}
```
- `image_path`: a `string` feature.
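For decoding, the tagset above can be inverted so that integer `ner_tags` map back to label strings. This is a small illustrative sketch, not part of the dataset loader; the sample tag sequence is made up.

```python
# FUNSD tagset as listed above; invert it to decode integer ner_tags.
tag2id = {'O': 0, 'B-HEADER': 1, 'I-HEADER': 2, 'B-QUESTION': 3,
          'I-QUESTION': 4, 'B-ANSWER': 5, 'I-ANSWER': 6}
id2tag = {idx: tag for tag, idx in tag2id.items()}

# Decode a made-up ner_tags sequence into readable labels.
sample_tags = [3, 4, 5, 6, 0]
decoded = [id2tag[i] for i in sample_tags]
print(decoded)  # ['B-QUESTION', 'I-QUESTION', 'B-ANSWER', 'I-ANSWER', 'O']
```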
### Data Splits
| name |train|test|
|------------|----:|---:|
|FUNSD-vu2020| 149| 50|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{vu2020revising,
title={Revising FUNSD dataset for key-value detection in document images},
author={Vu, Hieu M and Nguyen, Diep Thi-Ngoc},
journal={arXiv preprint arXiv:2010.05322},
year={2020}
}
``` |
sravaniayyagari/aeon-latest-json-dataset | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Content
dtype: string
splits:
- name: train
num_bytes: 137965
num_examples: 63
- name: validation
num_bytes: 21223
num_examples: 12
- name: test
num_bytes: 9840
num_examples: 3
download_size: 82259
dataset_size: 169028
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5 | ---
pretty_name: Evaluation run of Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5](https://huggingface.co/Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T12:37:54.457110](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5/blob/main/results_2024-03-07T12-37-54.457110.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508962736409662,\n\
\ \"acc_stderr\": 0.0320976039888963,\n \"acc_norm\": 0.651277168142893,\n\
\ \"acc_norm_stderr\": 0.03275836492190226,\n \"mc1\": 0.5532435740514076,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.7211439754946991,\n\
\ \"mc2_stderr\": 0.014513872408727079\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.681740614334471,\n \"acc_stderr\": 0.013611993916971455,\n\
\ \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274777\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6933877713602868,\n\
\ \"acc_stderr\": 0.004601446124041576,\n \"acc_norm\": 0.8794064927305317,\n\
\ \"acc_norm_stderr\": 0.003249887394706504\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.036390575699529276,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.036390575699529276\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"\
acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\
\ \"acc_stderr\": 0.012758410941038911,\n \"acc_norm\": 0.4784876140808344,\n\
\ \"acc_norm_stderr\": 0.012758410941038911\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5532435740514076,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.7211439754946991,\n\
\ \"mc2_stderr\": 0.014513872408727079\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.01043091746823743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6618650492797574,\n \
\ \"acc_stderr\": 0.013030829145172212\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-37-54.457110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-37-54.457110.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- '**/details_harness|winogrande|5_2024-03-07T12-37-54.457110.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T12-37-54.457110.parquet'
- config_name: results
data_files:
- split: 2024_03_07T12_37_54.457110
path:
- results_2024-03-07T12-37-54.457110.parquet
- split: latest
path:
- results_2024-03-07T12-37-54.457110.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5](https://huggingface.co/Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5",
"harness_winogrande_5",
split="train")
```
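The per-task parquet files listed in the configuration metadata above follow a consistent naming convention (`details_harness|<task>|<n_shot>_<timestamp>.parquet`). A small hypothetical helper (not part of the eval harness) illustrating that pattern:

```python
# Hypothetical helper that mirrors the file-naming convention used by
# the per-task configurations in the YAML metadata above.
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Build the glob pattern for one task's detail parquet files."""
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"

print(details_glob("winogrande", 5, "2024-03-07T12-37-54.457110"))
# -> **/details_harness|winogrande|5_2024-03-07T12-37-54.457110.parquet
```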
## Latest results
These are the [latest results from run 2024-03-07T12:37:54.457110](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5/blob/main/results_2024-03-07T12-37-54.457110.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6508962736409662,
"acc_stderr": 0.0320976039888963,
"acc_norm": 0.651277168142893,
"acc_norm_stderr": 0.03275836492190226,
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.7211439754946991,
"mc2_stderr": 0.014513872408727079
},
"harness|arc:challenge|25": {
"acc": 0.681740614334471,
"acc_stderr": 0.013611993916971455,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274777
},
"harness|hellaswag|10": {
"acc": 0.6933877713602868,
"acc_stderr": 0.004601446124041576,
"acc_norm": 0.8794064927305317,
"acc_norm_stderr": 0.003249887394706504
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.036390575699529276,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.036390575699529276
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038911,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5532435740514076,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.7211439754946991,
"mc2_stderr": 0.014513872408727079
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.01043091746823743
},
"harness|gsm8k|5": {
"acc": 0.6618650492797574,
"acc_stderr": 0.013030829145172212
}
}
```
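As a sanity check, the per-task numbers above can be re-aggregated locally. The sketch below uses a small hand-copied excerpt of the JSON (not the full results file) to average the `acc` field across the hendrycksTest (MMLU) entries:

```python
# Small excerpt of the results JSON above; values copied verbatim.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7236842105263158},
    "harness|winogrande|5": {"acc": 0.835043409629045},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracy.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
avg_acc = sum(mmlu) / len(mmlu)
print(f"{avg_acc:.4f}")  # mean accuracy over the excerpted MMLU tasks
```

The full results file aggregates all 57 MMLU subtasks the same way; the excerpt here only demonstrates the structure.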
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of NeuralNovel/Panda-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeuralNovel/Panda-7B-v0.1](https://huggingface.co/NeuralNovel/Panda-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Panda-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T15:18:35.035620](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Panda-7B-v0.1/blob/main/results_2024-01-04T15-18-35.035620.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6067411577411931,\n\
\ \"acc_stderr\": 0.03324319692041124,\n \"acc_norm\": 0.6115988704639006,\n\
\ \"acc_norm_stderr\": 0.03391766146815033,\n \"mc1\": 0.5214198286413708,\n\
\ \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6697345091207095,\n\
\ \"mc2_stderr\": 0.01518186947277888\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6520613423620792,\n\
\ \"acc_stderr\": 0.004753429806645438,\n \"acc_norm\": 0.8375821549492133,\n\
\ \"acc_norm_stderr\": 0.003680798950531901\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.028901593612411784,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.028901593612411784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153317,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153317\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.02508830145469483,\n \
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.02508830145469483\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335835,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335835\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.016242028834053616,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.016242028834053616\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811943,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412236,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412236\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304324,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304324\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n\
\ \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6697345091207095,\n\
\ \"mc2_stderr\": 0.01518186947277888\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3866565579984837,\n \
\ \"acc_stderr\": 0.013413955095965302\n }\n}\n```"
repo_url: https://huggingface.co/NeuralNovel/Panda-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|arc:challenge|25_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|gsm8k|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hellaswag|10_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T15-18-35.035620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T15-18-35.035620.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- '**/details_harness|winogrande|5_2024-01-04T15-18-35.035620.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T15-18-35.035620.parquet'
- config_name: results
data_files:
- split: 2024_01_04T15_18_35.035620
path:
- results_2024-01-04T15-18-35.035620.parquet
- split: latest
path:
- results_2024-01-04T15-18-35.035620.parquet
---
# Dataset Card for Evaluation run of NeuralNovel/Panda-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Panda-7B-v0.1](https://huggingface.co/NeuralNovel/Panda-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Panda-7B-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-04T15:18:35.035620](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Panda-7B-v0.1/blob/main/results_2024-01-04T15-18-35.035620.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6067411577411931,
"acc_stderr": 0.03324319692041124,
"acc_norm": 0.6115988704639006,
"acc_norm_stderr": 0.03391766146815033,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6697345091207095,
"mc2_stderr": 0.01518186947277888
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6520613423620792,
"acc_stderr": 0.004753429806645438,
"acc_norm": 0.8375821549492133,
"acc_norm_stderr": 0.003680798950531901
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.028901593612411784,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.028901593612411784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153317,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153317
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.02508830145469483,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.02508830145469483
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335835,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335835
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053616,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811943,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412236,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412236
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6697345091207095,
"mc2_stderr": 0.01518186947277888
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
},
"harness|gsm8k|5": {
"acc": 0.3866565579984837,
"acc_stderr": 0.013413955095965302
}
}
```
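The per-task metrics in the JSON above can be post-processed with plain Python once the details are loaded. As a minimal illustration (the accuracy values below are copied verbatim from the results above, not recomputed), tasks can be ranked from strongest to weakest:

```python
# A few per-task accuracies copied from the results shown above.
scores = {
    "hendrycksTest-world_religions": 0.8421052631578947,
    "hendrycksTest-abstract_algebra": 0.35,
    "winogrande": 0.7624309392265194,
    "gsm8k": 0.3866565579984837,
}

# Rank tasks from highest to lowest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to the full results dict after loading the "results" configuration with `load_dataset`.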
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/semeval-task-8-b | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: text
dtype: string
- name: model
dtype: string
- name: source
dtype: string
- name: label
dtype: int64
- name: id
dtype: int64
splits:
- name: train
num_bytes: 151567991
num_examples: 71027
- name: dev
num_bytes: 4814312
num_examples: 3000
download_size: 84851066
dataset_size: 156382303
---
# Dataset Card for "semeval-task-8-b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bertbsb/hebertespanhol | ---
license: openrail
---
|
open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0 | ---
pretty_name: Evaluation run of RaduGabriel/SirUkrainian2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RaduGabriel/SirUkrainian2.0](https://huggingface.co/RaduGabriel/SirUkrainian2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T14:47:08.297350](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0/blob/main/results_2024-02-16T14-47-08.297350.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6122899617068881,\n\
\ \"acc_stderr\": 0.03314256377542638,\n \"acc_norm\": 0.6163125160011517,\n\
\ \"acc_norm_stderr\": 0.033822587397925895,\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104,\n \"mc2\": 0.6423733209082649,\n\
\ \"mc2_stderr\": 0.01507454376325255\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719865,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.650866361282613,\n\
\ \"acc_stderr\": 0.004757220449283699,\n \"acc_norm\": 0.832603067118104,\n\
\ \"acc_norm_stderr\": 0.0037256689970413094\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823297,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n\
\ \"acc_stderr\": 0.016706617522176132,\n \"acc_norm\": 0.4782122905027933,\n\
\ \"acc_norm_stderr\": 0.016706617522176132\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948855,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948855\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001855,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001855\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235403,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235403\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104,\n \"mc2\": 0.6423733209082649,\n\
\ \"mc2_stderr\": 0.01507454376325255\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41015921152388174,\n \
\ \"acc_stderr\": 0.013548335117860338\n }\n}\n```"
repo_url: https://huggingface.co/RaduGabriel/SirUkrainian2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|arc:challenge|25_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|gsm8k|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hellaswag|10_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T14-47-08.297350.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- '**/details_harness|winogrande|5_2024-02-16T14-47-08.297350.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T14-47-08.297350.parquet'
- config_name: results
data_files:
- split: 2024_02_16T14_47_08.297350
path:
- results_2024-02-16T14-47-08.297350.parquet
- split: latest
path:
- results_2024-02-16T14-47-08.297350.parquet
---
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian2.0](https://huggingface.co/RaduGabriel/SirUkrainian2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
"harness_winogrande_5",
split="train")
```
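The per-task config names listed in the YAML above follow a regular pattern derived from the harness task name (e.g. `harness_hendrycksTest_abstract_algebra_5` for `hendrycksTest-abstract_algebra|5`). As a convenience, you can build these names programmatically; the helper below is an illustrative sketch inferred from the config list in this card, not an official API:

```python
# Illustrative helper (assumption: mapping inferred from this card's config
# list, not provided by the leaderboard tooling). It turns a harness task
# identifier such as "hendrycksTest-abstract_algebra|5" into the config name
# used in this repo, e.g. "harness_hendrycksTest_abstract_algebra_5".
def harness_config_name(task: str) -> str:
    return "harness_" + task.replace("|", "_").replace(":", "_").replace("-", "_")

config = harness_config_name("hendrycksTest-abstract_algebra|5")
print(config)  # harness_hendrycksTest_abstract_algebra_5

# The resulting name can then be passed to load_dataset, as in the
# example above:
# data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0",
#                     config, split="latest")
```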
## Latest results
These are the [latest results from run 2024-02-16T14:47:08.297350](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian2.0/blob/main/results_2024-02-16T14-47-08.297350.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6122899617068881,
"acc_stderr": 0.03314256377542638,
"acc_norm": 0.6163125160011517,
"acc_norm_stderr": 0.033822587397925895,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104,
"mc2": 0.6423733209082649,
"mc2_stderr": 0.01507454376325255
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719865,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.650866361282613,
"acc_stderr": 0.004757220449283699,
"acc_norm": 0.832603067118104,
"acc_norm_stderr": 0.0037256689970413094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176132,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176132
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948855,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948855
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001855,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235403,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104,
"mc2": 0.6423733209082649,
"mc2_stderr": 0.01507454376325255
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.41015921152388174,
"acc_stderr": 0.013548335117860338
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CATIE-AQ/newsquadfr_fr_prompt_question_generation_with_answer_and_context | ---
language:
- fr
license: cc-by-nc-sa-4.0
size_categories:
- 10K<n<100K
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- newsquadfr
---
# newsquadfr_fr_prompt_question_generation_with_answer_and_context
## Summary
**newsquadfr_fr_prompt_question_generation_with_answer_and_context** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **88,410** rows that can be used for a question generation (with answer and context) task.
The original data (without prompts) comes from the dataset [newsquadfr](https://huggingface.co/datasets/lincoln/newsquadfr) and was augmented by questions in SQUAD 2.0 format in the [FrenchQA]( https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
21 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative tense, in the tutoiement form and in the vouvoiement form.
```
'Déterminer la question qui aurait pu être posée pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Détermine la question que tu aurais pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Déterminez la question que vous auriez pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question aurait pu être posée pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question aurais-tu pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question auriez-vous pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question peut être posée pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question peux-tu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question pouvez-vous poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Sachant la réponse suivante : "'+answer+'"\n Générer une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Génère une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Générez une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Trouver une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Trouves une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Trouvez une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Créer une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Crée trouver une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Créez trouver une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Ecrire une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Ecris une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Ecrivez une bonne question pour le texte suivant : "'+context+'"
```
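As a rough illustration of how these templates turn a (context, answer, question) triple into input/target pairs: the function and the `inputs`/`targets` column names below are assumptions mirroring the xP3 format, not the actual DFP build code.

```python
# Hypothetical sketch of applying the prompt templates above;
# the actual DFP build script may differ.
def build_rows(context, answer, question, templates):
    rows = []
    for template in templates:
        rows.append({
            "inputs": template.format(context=context, answer=answer),
            "targets": question,
        })
    return rows

templates = [
    'Sachant la réponse suivante : "{answer}"\n Générer une bonne question pour le texte suivant : "{context}"',
]
rows = build_rows("Le chat dort sur le canapé.", "le chat", "Qui dort sur le canapé ?", templates)
```

Each of the 21 templates yields one row per (context, answer, question) triple, which is how the 21 prompts expand the original data into 88,410 rows.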
# Splits
- `train` with 69,300 samples
- `valid` with 19,100 samples
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/newsquadfr_fr_prompt_question_generation_with_answer_and_context")
```
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/lincoln/newsquadfr
## This Dataset
```bibtex
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author    = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title     = { DFP (Revision 1d24c09) },
  year      = 2023,
  url       = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi       = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
CC BY-NC-SA 4.0 |
arielnlee/Superimposed-Masked-Dataset | ---
license: other
task_categories:
- image-classification
language:
- en
tags:
- occlusion
size_categories:
- 10K<n<100K
---
# Superimposed Masked Dataset (SMD)
SMD is an occluded version of the ImageNet-1K validation set, created to serve as an additional way to evaluate the impact of occlusion on model performance. Occluder objects were segmented using Meta's Segment Anything and are not in the ImageNet-1K label space. They were chosen to be unambiguous in relation to objects that reside in the label space. Additional details about the dataset, including code to generate your own version of SMD, the actual occlusion percentage of each image in the dataset, and the occluder object segmentation masks, will be released shortly.

The occluders shown above from left to right, starting from the top row: <strong>Grogu (baby yoda), bacteria, bacteriophage, airpods, origami heart, drone, diamonds (stones, not setting) and coronavirus</strong>. Occluder object images were obtained through Unsplash.
SMD was created for testing model robustness to occlusion in [Hardwiring ViT Patch Selectivity into CNNs using Patch Mixing](https://arielnlee.github.io/PatchMixing/).
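A minimal sketch of the superimposition step, pasting a segmented occluder onto an image via its binary mask, assuming NumPy arrays. This is an illustration only, not the actual SMD generation code (which will be released separately).

```python
import numpy as np

def superimpose(image, occluder, mask, top, left):
    """Paste occluder pixels (where mask is True) onto a copy of image."""
    out = image.copy()
    h, w = mask.shape
    region = out[top:top + h, left:left + w]
    region[mask] = occluder[mask]  # region is a view, so `out` is modified in place
    return out

# Toy example: a white 4x4 occluder pasted onto an 8x8 black image.
image = np.zeros((8, 8, 3), dtype=np.uint8)
occluder = np.full((4, 4, 3), 255, dtype=np.uint8)
mask = np.ones((4, 4), dtype=bool)
occluded = superimpose(image, occluder, mask, top=2, left=2)
```

With a real segmentation mask, only the occluder's foreground pixels replace the underlying image, and the occluded fraction can be computed as `mask.sum() / image[..., 0].size`.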
## Citations
```bibtex
@misc{lee2023hardwiring,
title={Hardwiring ViT Patch Selectivity into CNNs using Patch Mixing},
author={Ariel N. Lee and Sarah Adel Bargal and Janavi Kasera and Stan Sclaroff and Kate Saenko and Nataniel Ruiz},
year={2023},
eprint={2306.17848},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@article{imagenet15russakovsky,
Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei},
Title = { {ImageNet Large Scale Visual Recognition Challenge} },
Year = {2015},
journal = {International Journal of Computer Vision (IJCV)},
doi = {10.1007/s11263-015-0816-y},
volume={115},
number={3},
pages={211-252}
}
``` |
openhuman/openhuman | ---
license: mit
---
|
agicorp/Text-to-sql-v1 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- SQL
size_categories:
- 100K<n<1M
--- |
joey234/mmlu-professional_medicine-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 218349
num_examples: 272
download_size: 124604
dataset_size: 218349
---
# Dataset Card for "mmlu-professional_medicine-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arpitsh018/apt-micro-dataset-llm-v2-714k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: int64
- name: source
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 1753434111.3731575
num_examples: 714801
- name: validation
num_bytes: 490607.6268424799
num_examples: 200
download_size: 911152910
dataset_size: 1753924719.0
---
# Dataset Card for "apt-micro-dataset-llm-v2-714k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
victorialee/openai_summarize_comparisons_relabel_GPTJ | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: test
num_bytes: 143018505
num_examples: 83629
relabeled_number: 27440
relabeled_percentage: 0.32811584498200386
- name: train
num_bytes: 157425966
num_examples: 92534
relabeled_number: 18447
relabeled_percentage: 0.19935375105366676
- name: valid1
num_bytes: 56686271
num_examples: 33082
- name: valid2
num_bytes: 86396487
num_examples: 50715
download_size: 20257716
dataset_size: 443527229
---
|
gantertfeldt/fullHD | ---
license: unlicense
---
|
Trelis/tiny-shakespeare | ---
task_categories:
- text-generation
language:
- en
tags:
- fine-tuning
- shakespeare
size_categories:
- n<1K
---
# Data source
Downloaded via Andrej Karpathy's nanogpt repo from this [link](https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt)
# Data Format
- The entire dataset is split into train (90%) and test (10%).
- All rows are at most 1024 tokens, using the Llama 2 tokenizer.
- All rows are split cleanly so that sentences are whole and unbroken. |
neerajnarwal/Command_Generation | ---
language:
- en
license: apache-2.0
---
|
InceptiveDev/CovetLetterDataset | ---
license: mit
---
|
s3prl/pre-releases | ---
license: apache-2.0
---
|
tyzhu/find_marker_both_sent_train_200_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1490922
num_examples: 1263
- name: validation
num_bytes: 223740
num_examples: 203
download_size: 351569
dataset_size: 1714662
---
# Dataset Card for "find_marker_both_sent_train_200_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tiat_sukasuka | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tiat Siba Igleo/ティアット・シバ・イグナレオ (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?)
This is the dataset of Tiat Siba Igleo/ティアット・シバ・イグナレオ (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?), containing 152 images and their tags.
The core tags of this character are `green_hair, green_eyes, short_hair, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 152 | 100.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiat_sukasuka/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 152 | 100.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiat_sukasuka/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 250 | 166.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiat_sukasuka/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tiat_sukasuka',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, upper_body, collarbone, short_hair_with_long_locks, solo, green_shirt, :o, looking_at_viewer, open_mouth, short_over_long_sleeves, brick_wall, vest, blurry_background, parody |
| 1 | 5 |  |  |  |  |  | 1girl, anime_coloring, day, outdoors, solo, sweatdrop, teeth, upper_body, collarbone, open_mouth, short_over_long_sleeves, sky, clenched_hand, jewelry, tree, parody, white_shirt |
| 2 | 6 |  |  |  |  |  | 1girl, open_mouth, :d, blush, anime_coloring, solo, day |
| 3 | 8 |  |  |  |  |  | 1boy, short_hair_with_long_locks, smile, solo_focus, 1girl, holding_hands, out_of_frame, green_dress, long_sleeves, aged_down, pantyhose, boots |
| 4 | 7 |  |  |  |  |  | 1girl, sailor_collar, collarbone, open_mouth, school_uniform, shirt, empty_eyes, looking_at_viewer, portrait, short_hair_with_long_locks, :o, solo_focus, parted_lips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | upper_body | collarbone | short_hair_with_long_locks | solo | green_shirt | :o | looking_at_viewer | open_mouth | short_over_long_sleeves | brick_wall | vest | blurry_background | parody | anime_coloring | day | outdoors | sweatdrop | teeth | sky | clenched_hand | jewelry | tree | white_shirt | :d | blush | 1boy | smile | solo_focus | holding_hands | out_of_frame | green_dress | long_sleeves | aged_down | pantyhose | boots | sailor_collar | school_uniform | shirt | empty_eyes | portrait | parted_lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------------|:-----------------------------|:-------|:--------------|:-----|:--------------------|:-------------|:--------------------------|:-------------|:-------|:--------------------|:---------|:-----------------|:------|:-----------|:------------|:--------|:------|:----------------|:----------|:-------|:--------------|:-----|:--------|:-------|:--------|:-------------|:----------------|:---------------|:--------------|:---------------|:------------|:------------|:--------|:----------------|:-----------------|:--------|:-------------|:-----------|:--------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | X | | | | X | | | | | | X | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X |
|
showchen/zero | ---
license: apache-2.0
---
|