| datasetId | card |
|---|---|
erikaxenia/id_card_class_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 151698325.0
num_examples: 529
- name: test
num_bytes: 16112586.0
num_examples: 58
download_size: 165957552
dataset_size: 167810911.0
---
# Dataset Card for "id_card_class_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NativeFunction/taxi-fare-sample | ---
dataset_info:
features:
- name: key
dtype: string
- name: fare_amount
dtype: float64
splits:
- name: train
num_bytes: 392825
num_examples: 9914
download_size: 100990
dataset_size: 392825
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlFrauch/step_by_step_diff_in_latex | ---
dataset_info:
features:
- name: function
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 23674753
num_examples: 6031
download_size: 3265633
dataset_size: 23674753
---
# Dataset Card for "step_by_step_diff_in_latex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
louisbrulenaudet/code-action-sociale-familles | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code de l'action sociale et des familles
source_datasets:
- original
pretty_name: Code de l'action sociale et des familles
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code de l'action sociale et des familles, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
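As an illustration of instruction-based fine-tuning data, an (instruction, input, output) triple can be rendered into a single training prompt. The Alpaca-style template below is only one common convention, not necessarily the exact template used for this project, and the input/output values are hypothetical:

```python
def format_example(instruction: str, input_text: str, output: str) -> str:
    """Render an (instruction, input, output) triple into one training prompt.

    Alpaca-style layout, used here purely as an illustration; the template
    actually used for fine-tuning may differ.
    """
    prompt = f"### Instruction:\n{instruction}\n"
    if input_text:
        prompt += f"\n### Input:\n{input_text}\n"
    prompt += f"\n### Response:\n{output}"
    return prompt

example = format_example(
    # Instruction drawn from the dataset's instruction list.
    "Compose l'intégralité de l'article sous forme écrite.",
    # Hypothetical input and output, for illustration only.
    "Code de l'action sociale et des familles, art. L116-1",
    "Texte intégral de l'article...",
)
```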
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging

import datasets
from tqdm.notebook import tqdm


def dataset_loader(
    name: str,
    streaming: bool = True
) -> datasets.Dataset:
    """
    Helper function to load a single dataset in parallel.

    Parameters
    ----------
    name : str
        Name of the dataset to be loaded.
    streaming : bool, optional
        Determines if datasets are streamed. Default is True.

    Returns
    -------
    dataset : datasets.Dataset or None
        Loaded dataset object, or None if loading failed.
    """
    try:
        return datasets.load_dataset(
            name,
            split="train",
            streaming=streaming
        )
    except Exception as exc:
        logging.error(f"Error loading dataset {name}: {exc}")
        return None


def load_datasets(
    req: list,
    streaming: bool = True
) -> list:
    """
    Downloads the datasets specified in a list and returns them as a list.

    Parameters
    ----------
    req : list
        A list containing the names of the datasets to be downloaded.
    streaming : bool, optional
        Determines if datasets are streamed. Default is True.

    Returns
    -------
    datasets_list : list
        A list of the loaded datasets, as per the names provided in 'req'.

    Examples
    --------
    >>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
    """
    datasets_list = []
    with concurrent.futures.ThreadPoolExecutor() as executor:
        # Pass `streaming` through so the flag is not silently ignored.
        future_to_dataset = {
            executor.submit(dataset_loader, name, streaming): name
            for name in req
        }
        for future in tqdm(
            concurrent.futures.as_completed(future_to_dataset),
            total=len(req)
        ):
            name = future_to_dataset[future]
            try:
                dataset = future.result()
                if dataset:
                    datasets_list.append(dataset)
            except Exception as exc:
                logging.error(f"Error processing dataset {name}: {exc}")
    return datasets_list


req = [
    "louisbrulenaudet/code-artisanat",
    "louisbrulenaudet/code-action-sociale-familles",
    # ...
]

datasets_list = load_datasets(
    req=req,
    streaming=True
)

dataset = datasets.concatenate_datasets(
    datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
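A minimal sketch of what one record looks like; the field names come from the list above, but the values below are invented for illustration:

```python
# Hypothetical record matching the documented schema; values are invented.
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code de l'action sociale et des familles, art. L116-1",
    "output": "Texte de l'article...",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
    "num": "L116-1",
}

# Every documented field is present, and all values are strings.
expected_fields = {"instruction", "input", "output", "start", "expiration", "num"}
assert set(record) == expected_fields
assert all(isinstance(value, str) for value in record.values())
```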
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
ashraq/tmdb-people-image | ---
dataset_info:
features:
- name: adult
dtype: bool
- name: also_known_as
dtype: string
- name: biography
dtype: string
- name: birthday
dtype: string
- name: deathday
dtype: string
- name: gender
dtype: int64
- name: homepage
dtype: string
- name: id
dtype: int64
- name: imdb_id
dtype: string
- name: known_for_department
dtype: string
- name: name
dtype: string
- name: place_of_birth
dtype: string
- name: popularity
dtype: float64
- name: profile_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 3749610460.6819267
num_examples: 116403
download_size: 3733145768
dataset_size: 3749610460.6819267
---
Data was obtained from [TMDB API](https://developers.themoviedb.org/3) |
mHossain/final_train_v4_test_100000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 5754300.3
num_examples: 18000
- name: test
num_bytes: 639366.7
num_examples: 2000
download_size: 2776539
dataset_size: 6393667.0
---
# Dataset Card for "final_train_v4_test_100000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LabHC/bias_in_bios | ---
license: mit
task_categories:
- text-classification
language:
- en
dataset_info:
features:
- name: hard_text
dtype: string
- name: profession
dtype: int64
- name: gender
dtype: int64
splits:
- name: train
num_bytes: 107487885
num_examples: 257478
- name: test
num_bytes: 41312256
num_examples: 99069
- name: dev
num_bytes: 16504417
num_examples: 39642
download_size: 99808338
dataset_size: 165304558
---
# Bias in Bios
Bias in Bios was created by (De-Arteaga et al., 2019) and published under the MIT license (https://github.com/microsoft/biosbias). The dataset is used to investigate bias in NLP models. It consists of textual biographies used to predict professional occupations; the sensitive attribute is gender (binary).
The version shared here is the one proposed by (Ravfogel et al., 2020), which is slightly smaller due to the unavailability of 5,557 biographies.
The dataset is divided into train (257,478 samples), test (99,069 samples), and dev (39,642 samples) sets.
To load the splits ('train', 'test', 'dev'), use the following code:
```python
from datasets import load_dataset

train_dataset = load_dataset("LabHC/bias_in_bios", split='train')
test_dataset = load_dataset("LabHC/bias_in_bios", split='test')
dev_dataset = load_dataset("LabHC/bias_in_bios", split='dev')
```
Below are the classification and sensitive attribute labels and their proportions. Distributions are similar across the three sets.
#### Classification labels
| Profession | Numerical label | Proportion (%)| | Profession | Numerical label | Proportion (%)|
|---|---|---|---|---|---|---|
| accountant | 0 | 1.42 | | nurse | 13 | 4.78 |
| architect | 1 | 2.55 | | painter | 14 | 1.95 |
| attorney | 2 | 8.22 | | paralegal | 15 | 0.45 |
| chiropractor | 3 | 0.67 | | pastor | 16 | 0.64 |
| comedian | 4 | 0.71 | | personal_trainer | 17 | 0.36 |
| composer | 5 | 1.41 | | photographer | 18 | 6.13 |
| dentist | 6 | 3.68 | | physician | 19 | 10.35 |
| dietitian | 7 | 1.0 | | poet | 20 | 1.77 |
| dj | 8 | 0.38 | | professor | 21 | 29.8 |
| filmmaker | 9 | 1.77 | | psychologist | 22 | 4.64 |
| interior_designer | 10 | 0.37 | | rapper | 23 | 0.35 |
| journalist | 11 | 5.03 | | software_engineer | 24 | 1.74 |
| model | 12 | 1.89 | | surgeon | 25 | 3.43 |
| nurse | 13 | 4.78 | | teacher | 26 | 4.09 |
| painter | 14 | 1.95 | | yoga_teacher | 27 | 0.42 |
#### Sensitive attributes
| Gender | Numerical label | Proportion (%)|
|---|---|---|
| Male | 0 | 53.9 |
| Female | 1 | 46.1 |
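The label mappings above can be captured directly in code. Below is a small sketch for decoding the integer `profession` and `gender` columns back to names; the lists are transcribed from the tables above, and `decode` is a hypothetical helper (it could be applied to the loaded dataset, e.g. with `dataset.map(decode)`):

```python
# Label names transcribed from the tables above; list index = numerical label.
PROFESSIONS = [
    "accountant", "architect", "attorney", "chiropractor", "comedian",
    "composer", "dentist", "dietitian", "dj", "filmmaker",
    "interior_designer", "journalist", "model", "nurse", "painter",
    "paralegal", "pastor", "personal_trainer", "photographer", "physician",
    "poet", "professor", "psychologist", "rapper", "software_engineer",
    "surgeon", "teacher", "yoga_teacher",
]
GENDERS = ["Male", "Female"]

def decode(example: dict) -> dict:
    """Replace the integer labels of one example with their string names."""
    return {
        **example,
        "profession": PROFESSIONS[example["profession"]],
        "gender": GENDERS[example["gender"]],
    }
```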
---
(De-Arteaga et al., 2019) Maria De-Arteaga, Alexey Romanov, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, and Adam Tauman Kalai. 2019. Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* '19). Association for Computing Machinery, New York, NY, USA, 120–128. https://doi.org/10.1145/3287560.3287572
(Ravfogel et al., 2020) Shauli Ravfogel, Yanai Elazar, Hila Gonen, Michael Twiton, and Yoav Goldberg. 2020. Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7237–7256, Online. Association for Computational Linguistics. |
SeyedAli/Persian-Text-QA | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: int32
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 18042405
num_examples: 9008
- name: validation
num_bytes: 1863345
num_examples: 930
download_size: 1788848
dataset_size: 19905750
---
|
joaoab8624/caio | ---
license: openrail
---
|
open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-5 | ---
pretty_name: Evaluation run of juhwanlee/llmdo-Mistral-7B-case-5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [juhwanlee/llmdo-Mistral-7B-case-5](https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T12:56:01.569270](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-5/blob/main/results_2024-03-07T12-56-01.569270.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6329322606875549,\n\
\ \"acc_stderr\": 0.032440672640608986,\n \"acc_norm\": 0.6383057666928241,\n\
\ \"acc_norm_stderr\": 0.03309847079092893,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.45463803907018246,\n\
\ \"mc2_stderr\": 0.014676282460911799\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449703,\n\
\ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.63433578968333,\n \
\ \"acc_stderr\": 0.004806316342709399,\n \"acc_norm\": 0.8339972117108145,\n\
\ \"acc_norm_stderr\": 0.0037132270642253893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337152,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337152\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.02777253333421896,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.02777253333421896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431374,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.030047357655806635,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.030047357655806635\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n\
\ \"acc_stderr\": 0.01604045442616447,\n \"acc_norm\": 0.358659217877095,\n\
\ \"acc_norm_stderr\": 0.01604045442616447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724556,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724556\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.45463803907018246,\n\
\ \"mc2_stderr\": 0.014676282460911799\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235803\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38514025777103866,\n \
\ \"acc_stderr\": 0.013404165536474303\n }\n}\n```"
repo_url: https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-56-01.569270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-56-01.569270.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- '**/details_harness|winogrande|5_2024-03-07T12-56-01.569270.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T12-56-01.569270.parquet'
- config_name: results
data_files:
- split: 2024_03_07T12_56_01.569270
path:
- results_2024-03-07T12-56-01.569270.parquet
- split: latest
path:
- results_2024-03-07T12-56-01.569270.parquet
---
# Dataset Card for Evaluation run of juhwanlee/llmdo-Mistral-7B-case-5
Dataset automatically created during the evaluation run of model [juhwanlee/llmdo-Mistral-7B-case-5](https://huggingface.co/juhwanlee/llmdo-Mistral-7B-case-5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-5",
"harness_winogrande_5",
	split="latest")
```
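As noted above, each per-run split is named from the run timestamp. A minimal sketch of that mapping (the helper name `timestamp_to_split` is hypothetical, not part of the `datasets` API): the `-` and `:` characters of the ISO timestamp are replaced with `_`, while the fractional-seconds dot is kept.

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2024-03-07T12:56:01.569270'
    to the corresponding split name '2024_03_07T12_56_01.569270'."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-07T12:56:01.569270"))
# 2024_03_07T12_56_01.569270
```

This split name can be passed as the `split` argument of `load_dataset` to select a specific run instead of `"latest"`.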
## Latest results
These are the [latest results from run 2024-03-07T12:56:01.569270](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__llmdo-Mistral-7B-case-5/blob/main/results_2024-03-07T12-56-01.569270.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6329322606875549,
"acc_stderr": 0.032440672640608986,
"acc_norm": 0.6383057666928241,
"acc_norm_stderr": 0.03309847079092893,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.45463803907018246,
"mc2_stderr": 0.014676282460911799
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449703,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303098
},
"harness|hellaswag|10": {
"acc": 0.63433578968333,
"acc_stderr": 0.004806316342709399,
"acc_norm": 0.8339972117108145,
"acc_norm_stderr": 0.0037132270642253893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337152,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337152
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.02777253333421896,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.02777253333421896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431374,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.030047357655806635,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.030047357655806635
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.01604045442616447,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.01604045442616447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724556,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724556
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.45463803907018246,
"mc2_stderr": 0.014676282460911799
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235803
},
"harness|gsm8k|5": {
"acc": 0.38514025777103866,
"acc_stderr": 0.013404165536474303
}
}
```
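Each per-task entry above shares one shape: an accuracy (`acc`) with its standard error (`acc_stderr`), plus normalized variants. A minimal sketch of reading such a results blob and averaging task accuracies (the two entries below are copied from the JSON above; the plain-mean aggregation is for illustration only, not necessarily the leaderboard's exact formula):

```python
import json

# Tiny excerpt of the results JSON above; each task entry carries
# "acc" plus its standard error ("acc_stderr").
results_json = """
{
  "harness|winogrande|5": {"acc": 0.7932123125493291, "acc_stderr": 0.011382566829235803},
  "harness|gsm8k|5": {"acc": 0.38514025777103866, "acc_stderr": 0.013404165536474303}
}
"""

results = json.loads(results_json)

# Illustrative aggregation: plain mean over the task accuracies.
mean_acc = sum(entry["acc"] for entry in results.values()) / len(results)
print(round(mean_acc, 4))  # prints 0.5892
```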
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-phpthinh__examplei-all-929d48-1748861030 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/examplei
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b1
metrics: ['f1']
dataset_name: phpthinh/examplei
dataset_config: all
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: phpthinh/examplei
* Config: all
* Split: test
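For programmatic use, the evaluation configuration in this card's YAML can be mirrored as a plain dict (field names and values are copied from the `eval_info` block above; the supported path for launching new jobs remains the model evaluator UI):

```python
# Sketch of this card's eval_info block as a Python dict, e.g. for
# logging or comparing evaluation runs. All values come from the
# YAML frontmatter of this card.
eval_info = {
    "task": "text_zero_shot_classification",
    "model": "bigscience/bloom-1b1",
    "metrics": ["f1"],
    "dataset_name": "phpthinh/examplei",
    "dataset_config": "all",
    "dataset_split": "test",
    # col_mapping ties the dataset's columns to the roles the task expects.
    "col_mapping": {"text": "text", "classes": "classes", "target": "target"},
}
```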
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
open-llm-leaderboard/details_ChaoticNeutrals__Eris_Floramix_DPO_7B | ---
pretty_name: Evaluation run of ChaoticNeutrals/Eris_Floramix_DPO_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/Eris_Floramix_DPO_7B](https://huggingface.co/ChaoticNeutrals/Eris_Floramix_DPO_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Eris_Floramix_DPO_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T17:56:53.440924](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Eris_Floramix_DPO_7B/blob/main/results_2024-03-09T17-56-53.440924.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6529783366883435,\n\
\ \"acc_stderr\": 0.032185941214831225,\n \"acc_norm\": 0.6524698643693841,\n\
\ \"acc_norm_stderr\": 0.03285997806002365,\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.01741294198611529,\n \"mc2\": 0.709449841928242,\n\
\ \"mc2_stderr\": 0.014874928360593245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.013273077865907592,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869142\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7176857199761004,\n\
\ \"acc_stderr\": 0.00449205527940711,\n \"acc_norm\": 0.8827922724556861,\n\
\ \"acc_norm_stderr\": 0.003210102507177252\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\
\ \"acc_stderr\": 0.01648278218750067,\n \"acc_norm\": 0.41564245810055866,\n\
\ \"acc_norm_stderr\": 0.01648278218750067\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083133,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.01741294198611529,\n \"mc2\": 0.709449841928242,\n\
\ \"mc2_stderr\": 0.014874928360593245\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272969\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \
\ \"acc_stderr\": 0.012896095359768111\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/Eris_Floramix_DPO_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|arc:challenge|25_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|gsm8k|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hellaswag|10_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-56-53.440924.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T17-56-53.440924.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- '**/details_harness|winogrande|5_2024-03-09T17-56-53.440924.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T17-56-53.440924.parquet'
- config_name: results
data_files:
- split: 2024_03_09T17_56_53.440924
path:
- results_2024-03-09T17-56-53.440924.parquet
- split: latest
path:
- results_2024-03-09T17-56-53.440924.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/Eris_Floramix_DPO_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Eris_Floramix_DPO_7B](https://huggingface.co/ChaoticNeutrals/Eris_Floramix_DPO_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Eris_Floramix_DPO_7B",
"harness_winogrande_5",
split="train")
```
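Once a per-task results dict is in hand (shaped like the JSON dump further down, with task names mapped to metric dicts), you can aggregate the metrics yourself. A minimal sketch, using illustrative placeholder values rather than the actual leaderboard numbers:

```python
from statistics import mean

def average_accuracy(per_task_results: dict) -> float:
    """Average the 'acc' metric across harness tasks, skipping the 'all' summary entry."""
    accs = [m["acc"] for name, m in per_task_results.items() if name != "all"]
    return mean(accs)

# Illustrative values only (not the real results for this model):
example = {
    "all": {"acc": 0.65},
    "harness|arc:challenge|25": {"acc": 0.709},
    "harness|hellaswag|10": {"acc": 0.718},
}
print(average_accuracy(example))  # prints 0.7135
```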
## Latest results
These are the [latest results from run 2024-03-09T17:56:53.440924](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Eris_Floramix_DPO_7B/blob/main/results_2024-03-09T17-56-53.440924.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.6529783366883435,
"acc_stderr": 0.032185941214831225,
"acc_norm": 0.6524698643693841,
"acc_norm_stderr": 0.03285997806002365,
"mc1": 0.5507955936352509,
"mc1_stderr": 0.01741294198611529,
"mc2": 0.709449841928242,
"mc2_stderr": 0.014874928360593245
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.013273077865907592,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869142
},
"harness|hellaswag|10": {
"acc": 0.7176857199761004,
"acc_stderr": 0.00449205527940711,
"acc_norm": 0.8827922724556861,
"acc_norm_stderr": 0.003210102507177252
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.01648278218750067,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.01648278218750067
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083133,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5507955936352509,
"mc1_stderr": 0.01741294198611529,
"mc2": 0.709449841928242,
"mc2_stderr": 0.014874928360593245
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272969
},
"harness|gsm8k|5": {
"acc": 0.6755117513267627,
"acc_stderr": 0.012896095359768111
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ihaflix1/vozdocumentario | ---
license: openrail
---
|
DynamicSuperbPrivate/SpeechTextMatching_LibrispeechTrainClean360 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 24960872147.768
num_examples: 104014
- name: validation
num_bytes: 348628035.844
num_examples: 2703
download_size: 23576168585
dataset_size: 25309500183.612003
---
# Dataset Card for "speechTextMatching_LibrispeechTrainClean360"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shuyuej/MetaMathQA | ---
license: apache-2.0
---
# MetaMath Dataset with `{"question": question, "answer": answer}` pairs
# 💻 Dataset Usage
Run the following code to load the data:
```python
from datasets import load_dataset
dataset = load_dataset("shuyuej/MetaMathQA")
dataset = dataset['train']
print(dataset)
```
# 📝 Dataset modification code
```python
# coding=utf-8
import jsonlines
from datasets import load_dataset
# Load the dataset
dataset = load_dataset("meta-math/MetaMathQA")
dataset = dataset["train"]
data = []
for example in dataset:
question = example['query']
answer = example['response']
data.append({"question": question, "answer": answer})
# Save the modified data to a jsonl file
output_file = 'MetaMathQA.jsonl'
with jsonlines.open(output_file, 'w') as writer:
writer.write_all(data)
print(f"Modified data saved to {output_file}")
```
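The resulting file has one JSON object per line, so it can also be read back with only the standard library, without the `jsonlines` dependency. A minimal sketch (the path `MetaMathQA.jsonl` matches the script above; adjust it if you saved elsewhere):

```python
import json

def read_jsonl(path):
    """Parse a JSON Lines file into a list of dicts, one record per line."""
    records = []
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # tolerate trailing blank lines
                records.append(json.loads(line))
    return records

# Example usage:
# records = read_jsonl("MetaMathQA.jsonl")
# print(records[0]["question"])
```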
|
open-llm-leaderboard/details_Sao10K__Euryale-L2-70B | ---
pretty_name: Evaluation run of Sao10K/Euryale-L2-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Euryale-L2-70B](https://huggingface.co/Sao10K/Euryale-L2-70B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Euryale-L2-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T17:20:57.246937](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Euryale-L2-70B/blob/main/results_2023-10-29T17-20-57.246937.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893235,\n \"f1\": 0.06751782718120815,\n\
\ \"f1_stderr\": 0.0013937914519446145,\n \"acc\": 0.5430945808722376,\n\
\ \"acc_stderr\": 0.011469812310058832\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893235,\n\
\ \"f1\": 0.06751782718120815,\n \"f1_stderr\": 0.0013937914519446145\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.265352539802881,\n \
\ \"acc_stderr\": 0.012161675464069675\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Euryale-L2-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T17_20_57.246937
path:
- '**/details_harness|drop|3_2023-10-29T17-20-57.246937.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T17-20-57.246937.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T17_20_57.246937
path:
- '**/details_harness|gsm8k|5_2023-10-29T17-20-57.246937.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T17-20-57.246937.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T17_20_57.246937
path:
- '**/details_harness|winogrande|5_2023-10-29T17-20-57.246937.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T17-20-57.246937.parquet'
- config_name: results
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- results_2023-09-19T00-30-23.278534.parquet
- split: 2023_10_29T17_20_57.246937
path:
- results_2023-10-29T17-20-57.246937.parquet
- split: latest
path:
- results_2023-10-29T17-20-57.246937.parquet
---
# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Euryale-L2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Euryale-L2-70B](https://huggingface.co/Sao10K/Euryale-L2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Euryale-L2-70B",
"harness_winogrande_5",
split="train")
```
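Each run's split name is simply the run timestamp with its dashes replaced by underscores (compare the split names and parquet file names in the YAML header above). A small illustrative helper, not part of any official API, to map between the two:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a results-file timestamp to the corresponding split name.

    e.g. "2023-10-29T17-20-57.246937" -> "2023_10_29T17_20_57.246937"
    """
    return timestamp.replace("-", "_")
```

Passing the returned name as `split=` instead of `"latest"` should select that specific run.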
## Latest results
These are the [latest results from run 2023-10-29T17:20:57.246937](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Euryale-L2-70B/blob/main/results_2023-10-29T17-20-57.246937.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893235,
"f1": 0.06751782718120815,
"f1_stderr": 0.0013937914519446145,
"acc": 0.5430945808722376,
"acc_stderr": 0.011469812310058832
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893235,
"f1": 0.06751782718120815,
"f1_stderr": 0.0013937914519446145
},
"harness|gsm8k|5": {
"acc": 0.265352539802881,
"acc_stderr": 0.012161675464069675
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
didsr/msynth | ---
license: cc0-1.0
task_categories:
- image-classification
- image-segmentation
tags:
- medical
pretty_name: M-SYNTH
size_categories:
- 10K<n<100K
---
# M-SYNTH
<!-- Provide a quick summary of the dataset. -->
M-SYNTH is a synthetic digital mammography (DM) dataset with four breast fibroglandular density distributions imaged using Monte Carlo x-ray simulations with the publicly available [Virtual Imaging Clinical Trial for Regulatory Evaluation (VICTRE)](https://github.com/DIDSR/VICTRE) toolkit.
## Dataset Details
The dataset has the following characteristics:
* Breast density: dense, heterogeneously dense, scattered, fatty
* Mass radius (mm): 5.00, 7.00, 9.00
* Mass density: 1.0, 1.06, 1.1 (ratio of radiodensity of the mass to that of fibroglandular tissue)
* Relative dose: 20%, 40%, 60%, 80%, 100% of the clinically recommended dose for each density
<p align="center">
<img src='https://raw.githubusercontent.com/DIDSR/msynth-release/main/images/examples.png' width='700'>
</p>
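Taken together, the parameters above define a grid of 4 × 3 × 3 × 5 = 180 acquisition settings. A quick sketch enumerating it (the string spellings are assumptions based on the loader arguments shown later in this card; "hetero" is the folder spelling of "heterogeneously dense"):

```python
from itertools import product

# Parameter values as listed above.
BREAST_DENSITIES = ["dense", "hetero", "scattered", "fatty"]
MASS_RADII_MM = [5.0, 7.0, 9.0]
MASS_DENSITIES = [1.0, 1.06, 1.1]
RELATIVE_DOSES = ["20%", "40%", "60%", "80%", "100%"]

# Every combination of breast density, mass radius, mass density, and dose.
settings = list(product(BREAST_DENSITIES, MASS_RADII_MM, MASS_DENSITIES, RELATIVE_DOSES))
print(len(settings))  # 4 * 3 * 3 * 5 = 180 combinations
```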
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [Elena Sizikova](https://esizikova.github.io/), [Niloufar Saharkhiz](https://www.linkedin.com/in/niloufar-saharkhiz/), [Diksha Sharma](https://www.linkedin.com/in/diksha-sharma-6059977/), [Miguel Lago](https://www.linkedin.com/in/milaan/), [Berkman Sahiner](https://www.linkedin.com/in/berkman-sahiner-6aa9a919/), [Jana Gut Delfino](https://www.linkedin.com/in/janadelfino/), [Aldo Badano](https://www.linkedin.com/in/aldobadano/)
- **License:** Creative Commons 1.0 Universal License (CC0)
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Code:** [https://github.com/DIDSR/msynth-release](https://github.com/DIDSR/msynth-release)
- **Paper:** [https://arxiv.org/pdf/2310.18494.pdf](https://arxiv.org/pdf/2310.18494.pdf)
- **Demo:** [https://github.com/DIDSR/msynth-release/tree/master/examples](https://github.com/DIDSR/msynth-release/tree/master/examples)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
M-SYNTH is intended to facilitate testing of AI with pre-computed synthetic mammography data.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
M-SYNTH can be used to evaluate the effect of mass size and density, breast density, and dose on AI performance in lesion detection.
M-SYNTH can be used to either train or test pre-trained AI models.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
M-SYNTH cannot be used in lieu of real patient examples to make performance determinations.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
M-SYNTH is organized into a directory structure that indicates the parameters. The folder
```
device_data_VICTREPhantoms_spic_[LESION_DENSITY]/[DOSE]/[BREAST_DENSITY]/2/[LESION_SIZE]/SIM/P2_[LESION_SIZE]_[BREAST_DENSITY].8337609.[PHANTOM_FILEID]/[PHANTOM_FILEID]/
```
contains image files imaged with the specified parameters. Note that only examples with an odd PHANTOM_FILEID contain lesions; the others do not.
```
$ tree data/device_data_VICTREPhantoms_spic_1.0/1.02e10/hetero/2/5.0/SIM/P2_5.0_hetero.8337609.1/1/
data/device_data_VICTREPhantoms_spic_1.0/1.02e10/hetero/2/5.0/SIM/P2_5.0_hetero.8337609.1/1/
├── DICOM_dm
│ └── 000.dcm
├── projection_DM1.loc
├── projection_DM1.mhd
└── projection_DM1.raw
```
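The acquisition parameters can be recovered directly from a folder path like the one above. The helper below is a sketch (not part of the official tooling), following the folder layout and the odd-ID lesion rule documented in this card:

```python
import re

# Pattern follows the folder layout documented above:
# device_data_VICTREPhantoms_spic_[LESION_DENSITY]/[DOSE]/[BREAST_DENSITY]/2/[LESION_SIZE]/SIM/...
_MSYNTH_PATTERN = re.compile(
    r"device_data_VICTREPhantoms_spic_(?P<lesion_density>[\d.]+)/"
    r"(?P<dose>[^/]+)/"
    r"(?P<breast_density>[^/]+)/2/"
    r"(?P<lesion_size>[\d.]+)/"
    r"SIM/P2_[\d.]+_[^.]+\.\d+\.(?P<phantom_id>\d+)/"
)

def parse_msynth_path(path: str) -> dict:
    """Extract acquisition parameters from an M-SYNTH device-data path."""
    match = _MSYNTH_PATTERN.search(path)
    if match is None:
        raise ValueError(f"not an M-SYNTH device-data path: {path}")
    info = match.groupdict()
    # Only odd phantom IDs contain a lesion (see the note above).
    info["has_lesion"] = int(info["phantom_id"]) % 2 == 1
    return info
```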
Each folder contains mammogram data that can be read either from the .raw format (the companion .mhd file holds the image metadata) or from the DICOM (.dcm) format.
Coordinates of lesions can be found in the .loc files. Segmentations are stored in .raw format and can be found in data/segmentation_masks/*.
See [Github](https://github.com/DIDSR/msynth-release/tree/main/code) for examples of how to access the files, and [examples](https://github.com/DIDSR/msynth-release/tree/main/examples) for code to load each type of file.
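As a sketch of the .mhd/.raw route: the .mhd header is a plain-text MetaImage file of `Key = Value` lines, and its `DimSize` entry tells you how to reshape the companion .raw pixel buffer. The field values and the float32 pixel type below are assumptions; check `ElementType` in your own copy of the files.

```python
import numpy as np

def parse_mhd_header(text: str) -> dict:
    """Parse a MetaImage (.mhd) header, a plain-text 'Key = Value' file."""
    header = {}
    for line in text.splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            header[key.strip()] = value.strip()
    return header

def load_raw_projection(header: dict, raw_bytes: bytes, dtype=np.float32) -> np.ndarray:
    """Reshape the companion .raw pixel buffer using DimSize from the header.

    The float32 default is an assumption -- match it to the ElementType
    field (e.g. MET_FLOAT -> float32) in the actual header.
    """
    dims = [int(d) for d in header["DimSize"].split()]
    # MetaImage lists dimensions fastest-varying first, so reverse for numpy.
    return np.frombuffer(raw_bytes, dtype=dtype).reshape(dims[::-1])
```

For the DICOM copy of the same projection, a reader such as `pydicom.dcmread` can be used instead.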
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Simulation-based testing is constrained to the parameter variability represented in the object model and the acquisition system.
There is a risk of misjudging model performance if the simulated examples do not capture the variability in real patients. Please
see the paper for a full discussion of biases, risks, and limitations.
## How to use it
M-SYNTH is a very large dataset, so for most use cases it is recommended to use the streaming API of `datasets`.
The dataset has three configurations: `device_data`, `segmentation_mask`, and `metadata`.
You can load and iterate through the dataset under each configuration with the following lines of code:
```python
from datasets import load_dataset
ds_data = load_dataset("didsr/msynth", 'device_data')  # device data for all breast densities, mass radii, mass densities, and relative doses; use the 'segmentation_mask' or 'metadata' configuration to load the segmentation masks or bounds information instead
print(ds_data["device_data"])
# A sample data instance
{'Raw': '~\\.cache\\huggingface\\datasets\\downloads\\extracted\\59384cf05fc44e8c0cb23bb19e1fcd8f0c39720b282109d204a85561fe66bdb1\\SIM\\P2_5.0_fatty.8336179.1\\1\\projection_DM1.raw',
'mhd': '~/.cache/huggingface/datasets/downloads/extracted/59384cf05fc44e8c0cb23bb19e1fcd8f0c39720b282109d204a85561fe66bdb1/SIM/P2_5.0_fatty.8336179.1/1\\projection_DM1.mhd',
'loc': '~/.cache/huggingface/datasets/downloads/extracted/59384cf05fc44e8c0cb23bb19e1fcd8f0c39720b282109d204a85561fe66bdb1/SIM/P2_5.0_fatty.8336179.1/1\\projection_DM1.loc',
'dcm': '~/.cache/huggingface/datasets/downloads/extracted/59384cf05fc44e8c0cb23bb19e1fcd8f0c39720b282109d204a85561fe66bdb1/SIM/P2_5.0_fatty.8336179.1/1\\DICOM_dm\\000.dcm',
'density': 'fatty',
'mass_radius': 5.0}
```
The M-SYNTH dataset can also be loaded with a custom selection of breast density, mass radius, mass density, and relative dose values:
```python
from datasets import load_dataset
# Dataset properties. Change any entry to 'all' to include every value of breast density, mass radius, mass density, and relative dose
config_kwargs = {
"lesion_density": ["1.0"],
"dose": ["20%"],
"density": ["fatty"],
"size": ["5.0"]
}
# Loading device data
ds_data = load_dataset("didsr/msynth", 'device_data', **config_kwargs)
# Loading segmentation-mask
ds_seg = load_dataset("didsr/msynth", 'segmentation_mask', **config_kwargs)
```
The metadata can also be loaded using the `datasets` API. An example of using the metadata is given in the **Demo:** [https://github.com/DIDSR/msynth-release/tree/master/examples](https://github.com/DIDSR/msynth-release/tree/master/examples)
```python
from datasets import load_dataset
# Loading metadata
ds_meta = load_dataset("didsr/msynth", 'metadata')
# A sample data instance
ds_meta['metadata'][0]
# Output
{'fatty': '~\\.cache\\huggingface\\datasets\\downloads\\extracted\\3ea85fc6b3fcc253ac8550b5d1b21db406ca9a59ea125ff8fc63d9b754c88348\\bounds\\bounds_fatty.npy',
'dense': '~\\.cache\\huggingface\\datasets\\downloads\\extracted\\3ea85fc6b3fcc253ac8550b5d1b21db406ca9a59ea125ff8fc63d9b754c88348\\bounds\\bounds_dense.npy',
'hetero': '~\\.cache\\huggingface\\datasets\\downloads\\extracted\\3ea85fc6b3fcc253ac8550b5d1b21db406ca9a59ea125ff8fc63d9b754c88348\\bounds\\bounds_hetero.npy',
'scattered': '~\\.cache\\huggingface\\datasets\\downloads\\extracted\\3ea85fc6b3fcc253ac8550b5d1b21db406ca9a59ea125ff8fc63d9b754c88348\\bounds\\bounds_scattered.npy'}
```
## Citation
```
@article{sizikova2023knowledge,
title={Knowledge-based in silico models and dataset for the comparative evaluation of mammography AI for a range of breast characteristics, lesion conspicuities and doses},
author={Sizikova, Elena and Saharkhiz, Niloufar and Sharma, Diksha and Lago, Miguel and Sahiner, Berkman and Delfino, Jana G. and Badano, Aldo},
journal={Advances in Neural Information Processing Systems},
volume={},
pages={},
year={2023}
}
```
## Related Links
1. [Virtual Imaging Clinical Trial for Regulatory Evaluation (VICTRE)](https://www.fda.gov/medical-devices/science-and-research-medical-devices/victre-silico-breast-imaging-pipeline).
2. [FDA Catalog of Regulatory Science Tools to Help Assess New Medical Devices](https://www.fda.gov/medical-devices/science-and-research-medical-devices/catalog-regulatory-science-tools-help-assess-new-medical-devices).
3. A. Badano, C. G. Graff, A. Badal, D. Sharma, R. Zeng, F. W. Samuelson, S. Glick, K. J. Myers. [Evaluation of Digital Breast Tomosynthesis as Replacement of Full-Field Digital Mammography Using an In Silico Imaging Trial](http://dx.doi.org/10.1001/jamanetworkopen.2018.5474). JAMA Network Open 2018.
4. A. Badano, M. Lago, E. Sizikova, J. G. Delfino, S. Guan, M. A. Anastasio, B. Sahiner. [The stochastic digital human is now enrolling for in silico imaging trials—methods and tools for generating digital cohorts.](http://dx.doi.org/10.1088/2516-1091/ad04c0) Progress in Biomedical Engineering 2023.
5. E. Sizikova, N. Saharkhiz, D. Sharma, M. Lago, B. Sahiner, J. G. Delfino, A. Badano. [Knowledge-based in silico models and dataset for the comparative evaluation of mammography AI](https://github.com/DIDSR/msynth-release). NeurIPS 2023 Workshop on Synthetic Data Generation with Generative AI. |
fathyshalab/massive_takeaway-de-DE | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: partition
dtype: string
- name: scenario
dtype:
class_label:
names:
'0': social
'1': transport
'2': calendar
'3': play
'4': news
'5': datetime
'6': recommendation
'7': email
'8': iot
'9': general
'10': audio
'11': lists
'12': qa
'13': cooking
'14': takeaway
'15': music
'16': alarm
'17': weather
- name: intent
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: text
dtype: string
- name: annot_utt
dtype: string
- name: worker_id
dtype: string
- name: slot_method
sequence:
- name: slot
dtype: string
- name: method
dtype: string
- name: judgments
sequence:
- name: worker_id
dtype: string
- name: intent_score
dtype: int8
- name: slots_score
dtype: int8
- name: grammar_score
dtype: int8
- name: spelling_score
dtype: int8
- name: language_identification
dtype: string
- name: label_name
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 85401
num_examples: 257
- name: validation
num_bytes: 13974
num_examples: 44
- name: test
num_bytes: 18174
num_examples: 57
download_size: 51358
dataset_size: 117549
---
# Dataset Card for "massive_takeaway-de-DE"
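The `scenario` and `intent` columns are integer `ClassLabel` features; decoding them only needs the name tables from the YAML header above. A minimal sketch for the scenario labels (when loading with `datasets`, the equivalent is `ds.features["scenario"].int2str`):

```python
# Scenario names copied verbatim from the YAML header of this card.
SCENARIO_NAMES = [
    "social", "transport", "calendar", "play", "news", "datetime",
    "recommendation", "email", "iot", "general", "audio", "lists",
    "qa", "cooking", "takeaway", "music", "alarm", "weather",
]

def scenario_name(label_id: int) -> str:
    """Decode an integer scenario label into its string name."""
    return SCENARIO_NAMES[label_id]
```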
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
samadpls/querypls-prompt2sql-dataset | ---
dataset_info:
features:
- name: context
dtype: string
- name: answer
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 17419604
num_examples: 78577
- name: validation
num_bytes: 17419604
num_examples: 78577
download_size: 13675124
dataset_size: 34839208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
license: apache-2.0
task_categories:
- text-classification
language:
- en
---
# 📚🤖 Querypls-prompt2sql
## Dataset Information
The Querypls-prompt2sql dataset is designed for text classification tasks related to generating SQL queries. It contains the following features:
- **Context:** String
- **Answer:** String
- **Autotrain Text:** String
The dataset is split into two parts:
- **Training Set:**
- Number of Examples: 78,577
- Size: 17,419,604 bytes
- **Validation Set:**
- Number of Examples: 78,577
- Size: 17,419,604 bytes
The total download size of the dataset is 13,675,124 bytes, and the dataset size is 34,839,208 bytes.
## Dataset Configuration
The default configuration includes the following data files:
- **Training Split:**
- Path: data/train-*
- **Validation Split:**
- Path: data/validation-*
The dataset is licensed under Apache-2.0.
## Task Categories
- Text Classification
## Language
- English
## How to Contribute
For information on contributing to the dataset cards, please refer to the [Hugging Face Datasets Contribution Guidelines](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards).
|
Intuit-GenSRF/haternet-es | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
- name: processed_text
sequence: string
- name: text_en
dtype: string
splits:
- name: train
num_bytes: 2211676
num_examples: 6000
download_size: 1501046
dataset_size: 2211676
---
# Dataset Card for "haternet-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32 | ---
pretty_name: Evaluation run of nisten/bigdoc-c34b-instruct-tf32
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nisten/bigdoc-c34b-instruct-tf32](https://huggingface.co/nisten/bigdoc-c34b-instruct-tf32)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T03:54:56.700611](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32/blob/main/results_2024-02-02T03-54-56.700611.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.555723855403699,\n\
\ \"acc_stderr\": 0.034056643851026316,\n \"acc_norm\": 0.5596322402810356,\n\
\ \"acc_norm_stderr\": 0.034763618590594646,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454895,\n \"mc2\": 0.4446186897080597,\n\
\ \"mc2_stderr\": 0.014549361291628982\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285012,\n\
\ \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.01455374993930686\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5637323242381995,\n\
\ \"acc_stderr\": 0.004949080334816023,\n \"acc_norm\": 0.7690699063931488,\n\
\ \"acc_norm_stderr\": 0.004205665144562955\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464244,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464244\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014499,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.603225806451613,\n \"acc_stderr\": 0.027831231605767948,\n \"\
acc_norm\": 0.603225806451613,\n \"acc_norm_stderr\": 0.027831231605767948\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n \"\
acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391242,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391242\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.031730712390717244,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.031730712390717244\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412188,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412188\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.01934203658770258,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.01934203658770258\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695063,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.0162460870697014,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.0162460870697014\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.026613350840261736,\n\
\ \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.026613350840261736\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.0156099295593484,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.0156099295593484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925657,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925657\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39308996088657105,\n\
\ \"acc_stderr\": 0.012474899613873956,\n \"acc_norm\": 0.39308996088657105,\n\
\ \"acc_norm_stderr\": 0.012474899613873956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.03030625772246832,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.03030625772246832\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n\
\ \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.6530612244897959,\n\
\ \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.02970528405677243,\n\
\ \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.02970528405677243\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n\
\ \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n\
\ \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n\
\ \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454895,\n\
\ \"mc2\": 0.4446186897080597,\n \"mc2_stderr\": 0.014549361291628982\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n\
\ \"acc_stderr\": 0.012261253845440474\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.37604245640636846,\n \"acc_stderr\": 0.01334253206484978\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nisten/bigdoc-c34b-instruct-tf32
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-54-56.700611.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- '**/details_harness|winogrande|5_2024-02-02T03-54-56.700611.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T03-54-56.700611.parquet'
- config_name: results
data_files:
- split: 2024_02_02T03_54_56.700611
path:
- results_2024-02-02T03-54-56.700611.parquet
- split: latest
path:
- results_2024-02-02T03-54-56.700611.parquet
---
# Dataset Card for Evaluation run of nisten/bigdoc-c34b-instruct-tf32
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nisten/bigdoc-c34b-instruct-tf32](https://huggingface.co/nisten/bigdoc-c34b-instruct-tf32) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
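The timestamped split names in the configurations above appear to be derived from the run timestamp by a simple character substitution (dashes and colons become underscores); this is not documented explicitly, but the following sketch reproduces the pattern seen in this card:

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Convert an ISO-like run timestamp into the split name used in this card.

    The date part swaps '-' for '_' and the time part swaps ':' for '_',
    e.g. '2024-02-02T03:54:56.700611' -> '2024_02_02T03_54_56.700611'.
    """
    date_part, time_part = run_timestamp.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(timestamp_to_split_name("2024-02-02T03:54:56.700611"))
```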
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-02T03:54:56.700611](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__bigdoc-c34b-instruct-tf32/blob/main/results_2024-02-02T03-54-56.700611.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in its own configuration, and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.555723855403699,
"acc_stderr": 0.034056643851026316,
"acc_norm": 0.5596322402810356,
"acc_norm_stderr": 0.034763618590594646,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454895,
"mc2": 0.4446186897080597,
"mc2_stderr": 0.014549361291628982
},
"harness|arc:challenge|25": {
"acc": 0.5102389078498294,
"acc_stderr": 0.014608326906285012,
"acc_norm": 0.5443686006825939,
"acc_norm_stderr": 0.01455374993930686
},
"harness|hellaswag|10": {
"acc": 0.5637323242381995,
"acc_stderr": 0.004949080334816023,
"acc_norm": 0.7690699063931488,
"acc_norm_stderr": 0.004205665144562955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464244,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464244
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014499,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.603225806451613,
"acc_stderr": 0.027831231605767948,
"acc_norm": 0.603225806451613,
"acc_norm_stderr": 0.027831231605767948
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391242,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391242
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.031730712390717244,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.031730712390717244
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412188,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412188
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.0284934650910286,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.0284934650910286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.01934203658770258,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.01934203658770258
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652265,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.0162460870697014,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.0162460870697014
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.026613350840261736,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.026613350840261736
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.0156099295593484,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.0156099295593484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.027770918531427838,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.027770918531427838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925657,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925657
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39308996088657105,
"acc_stderr": 0.012474899613873956,
"acc_norm": 0.39308996088657105,
"acc_norm_stderr": 0.012474899613873956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.02970528405677243,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.02970528405677243
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454895,
"mc2": 0.4446186897080597,
"mc2_stderr": 0.014549361291628982
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
},
"harness|gsm8k|5": {
"acc": 0.37604245640636846,
"acc_stderr": 0.01334253206484978
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_wnli_more_much | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 870
num_examples: 4
- name: test
num_bytes: 595
num_examples: 2
- name: train
num_bytes: 3805
num_examples: 18
download_size: 12598
dataset_size: 5270
---
# Dataset Card for "MULTI_VALUE_wnli_more_much"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bgspaditya/malurl-train-test | ---
license: mit
dataset_info:
features:
- name: url
dtype: string
- name: type
dtype: string
- name: type_code
dtype: int64
splits:
- name: train
num_bytes: 43302335.10276401
num_examples: 520952
- name: test
num_bytes: 10825666.897235988
num_examples: 130239
download_size: 32726720
dataset_size: 54128002.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Kaisaplumaluz/JB2 | ---
license: openrail
---
|
tr416/instruct_out_bc_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29059508
num_examples: 29581
download_size: 14969317
dataset_size: 29059508
---
# Dataset Card for "instruct_out_bc_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emozilla/soda_synthetic_dialogue | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: "SODA Synthetic Dialogue"
size_categories:
- 1M<n<10M
source_datasets:
- extended|allenai/soda
tags:
- open-assistant
- conversations-summarization
task_categories:
- conversational
task_ids:
- dialogue-generation
dataset_info:
features:
- name: conversation
dtype: string
config_name: soda_synthetic_dialogue_dialogue_modeling
splits:
- name: train
num_bytes: 1524941476
num_examples: 1191582
- name: test
num_bytes: 190565707
num_examples: 148968
- name: validation
num_bytes: 187161878
num_examples: 146346
download_size: 1952926579
dataset_size: 1902669061
---
# Dataset Card for 🥤SODA Synthetic Dialogue
## Dataset Description
- **Repository:**
[Code](https://github.com/LAION-AI/Open-Assistant/openassistant/datasets/soda_synthetic_dialogue/)
## Dataset Summary
🥤SODA Synthetic Dialogue is a set of synthetic dialogues between Assistant and
User. In each conversation, User asks Assistant to perform summarization or
story generation tasks based on a snippet of an existing dialogue, story, or
from a title or theme.
This data was created by synthesizing the dialogues in
[🥤Soda](https://huggingface.co/datasets/allenai/soda) and applying a set of
templates to generate the conversation. The original research paper can be found
[here](https://arxiv.org/pdf/2212.10465v1.pdf).
Example:
```
User: Can you write the next few lines of dialogue for this scene:
Cornell: I know what you're thinking, Dontrell. I can sense your feelings and know exactly what you're going to say or do next. You're thinking about how much you miss your family, and how you wish you could see them again. You're wondering if there's anything you can do to make that happen. And you're also feeling a little guilty, because you know that if you could see them again, it would only be for a short while before they died.
Dontrell: Cornell, how can you possibly know all that? It's like you can read my mind!
Assistant: Sure, the next dialogue for this scene could be:
Cornell: I told you, I can sense your feelings. It's like I have a connection to your thoughts and emotions. I'm not sure why or how it works, but ever since we met, I've been able to understand what you're thinking and feeling pretty easily.
Dontrell: Wow...that's amazing! And a little bit scary too. But it does explain why we've always been so close - even though we've only known each other for a few months.
Cornell: Yeah, I guess our connection is pretty strong. But don't worry, Dontrell - I'll never use it to manipulate or control you in any way. I promise."
Dontrell: That's a relief! I was starting to feel a little bit like you could read my mind, and it was kind of creepy. But if you promise not to use your abilities to control me, then I'm okay with it.
Cornell: I swear, Dontrell. I'll never use my abilities to control or manipulate you in any way. You have my word.
User: And a title?
Assistant: Sure, a title for this dialogue could be:
Cornell is intuitive. Cornell reads Dontrell's mind.
User: How about a short description?
Assistant: Sure, a short description for this dialogue could be:
Cornell knows what Dontrell is thinking. He can sense her feelings and know exactly what she is going to say or do next.
User: What would be one theme of this story?
Assistant: One theme of this story could be:
being intuitive
```
## Usage
The dataset contains one configuration, `dialogue_modeling`, which has a single
text `conversation` feature.
## Source data
The script to synthesize this dataset was originally created by
[ontocord](https://github.com/ontocord) in
[this Colab notebook](https://colab.research.google.com/drive/1Sw3px5dP8whdqT7QMNoqwmqIasZkMbJi?usp=sharing)
and prepared for Hugging Face by
[Jeffrey Quesnelle](https://github.com/jquesnelle/).
## Citation
Please cite our work if you find the resources in this repository useful:
```
@article{ontocord2023sodasynth,
author = {ontocord and Jeffrey Quesnelle},
title = {SODA Synthetic Dialogue},
year = {2023}
}
``` |
valluvera/gemma | ---
license: other
---
|
open-llm-leaderboard/details_beberik__Nyxene-v3-11B | ---
pretty_name: Evaluation run of beberik/Nyxene-v3-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beberik/Nyxene-v3-11B](https://huggingface.co/beberik/Nyxene-v3-11B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__Nyxene-v3-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T11:54:50.915290](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v3-11B/blob/main/results_2023-12-13T11-54-50.915290.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504771774813103,\n\
\ \"acc_stderr\": 0.03215464413567304,\n \"acc_norm\": 0.6518364388315545,\n\
\ \"acc_norm_stderr\": 0.03280459937231609,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6091044754563715,\n\
\ \"mc2_stderr\": 0.015269448129178369\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880536,\n\
\ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778768\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6650069707229636,\n\
\ \"acc_stderr\": 0.004710234188047365,\n \"acc_norm\": 0.8533160724955188,\n\
\ \"acc_norm_stderr\": 0.0035306750148923196\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899092,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899092\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n\
\ \"acc_stderr\": 0.016469814928406164,\n \"acc_norm\": 0.4134078212290503,\n\
\ \"acc_norm_stderr\": 0.016469814928406164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233815,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233815\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6091044754563715,\n\
\ \"mc2_stderr\": 0.015269448129178369\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487054\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \
\ \"acc_stderr\": 0.013258428375662247\n }\n}\n```"
repo_url: https://huggingface.co/beberik/Nyxene-v3-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|arc:challenge|25_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|gsm8k|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hellaswag|10_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T11-54-50.915290.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- '**/details_harness|winogrande|5_2023-12-13T11-54-50.915290.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T11-54-50.915290.parquet'
- config_name: results
data_files:
- split: 2023_12_13T11_54_50.915290
path:
- results_2023-12-13T11-54-50.915290.parquet
- split: latest
path:
- results_2023-12-13T11-54-50.915290.parquet
---
# Dataset Card for Evaluation run of beberik/Nyxene-v3-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beberik/Nyxene-v3-11B](https://huggingface.co/beberik/Nyxene-v3-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v3-11B",
"harness_winogrande_5",
split="train")
```
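The harness config names listed in the YAML above follow a regular pattern: `harness_`, then the task name with `-` and `:` replaced by `_`, then the few-shot count. A small helper can build them programmatically — note this is a sketch inferred from this card's naming, not part of the `datasets` API:

```python
def harness_config_name(task: str, shots: int) -> str:
    """Build a leaderboard config name from a harness task and shot count.

    Mirrors the naming visible in this card's YAML, e.g.
    "hendrycksTest-college_biology" with 5 shots becomes
    "harness_hendrycksTest_college_biology_5".
    """
    # Task names use "-" and ":" (e.g. "truthfulqa:mc"); config names use "_".
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{shots}"

print(harness_config_name("hendrycksTest-college_biology", 5))
# harness_hendrycksTest_college_biology_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset`, as in the snippet above.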
## Latest results
These are the [latest results from run 2023-12-13T11:54:50.915290](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v3-11B/blob/main/results_2023-12-13T11-54-50.915290.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504771774813103,
"acc_stderr": 0.03215464413567304,
"acc_norm": 0.6518364388315545,
"acc_norm_stderr": 0.03280459937231609,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6091044754563715,
"mc2_stderr": 0.015269448129178369
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880536,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778768
},
"harness|hellaswag|10": {
"acc": 0.6650069707229636,
"acc_stderr": 0.004710234188047365,
"acc_norm": 0.8533160724955188,
"acc_norm_stderr": 0.0035306750148923196
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899092,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899092
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.016469814928406164,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.016469814928406164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233815,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233815
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6091044754563715,
"mc2_stderr": 0.015269448129178369
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487054
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662247
}
}
```
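The `"all"` entry at the top of the results is an aggregate of the per-task scores. As a rough sketch of such an aggregation (a plain macro-average over a small excerpt of the tasks above — the leaderboard's own computation may weight or select metrics differently):

```python
# Excerpt of the per-task results shown above (values copied from this card).
results = {
    "harness|hendrycksTest-college_biology|5": {"acc": 0.7569444444444444},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.53},
    "harness|winogrande|5": {"acc": 0.8018942383583267},
}

def mean_metric(results: dict, metric: str = "acc") -> float:
    """Macro-average a metric over every task that reports it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

print(round(mean_metric(results), 4))  # ≈ 0.6963 for this three-task excerpt
```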
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HAERAE-HUB/QARV-preview | ---
dataset_info:
features:
- name: question
dtype: string
- name: template
dtype: string
- name: method
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 135201
num_examples: 1000
download_size: 46637
dataset_size: 135201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# QARV (Question and Answers with Regional Variance)
The QARV (Question and Answers with Regional Variance) project aims to curate a collection of questions with answers that exhibit regional variations across different nations.
## Version
This version contains 1k questions. We are working to add answers for the US & Korea. If you are interested in collaborating, please let us know. |
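The YAML block above declares the card's schema (`question`, `template`, and `method` string fields, plus an integer index column) and a 1,000-example `train` split. A minimal sketch of working with rows of that shape — using hypothetical in-memory records purely to illustrate the declared fields, not the real data — could be:

```python
from collections import Counter

# Hypothetical records mirroring the schema declared in the card's YAML
# (question, template, method); the real rows live on the Hub.
rows = [
    {"question": "Q1", "template": "A", "method": "manual"},
    {"question": "Q2", "template": "B", "method": "manual"},
    {"question": "Q3", "template": "A", "method": "generated"},
]

# Count how many questions were produced from each template.
counts = Counter(r["template"] for r in rows)
print(counts["A"])  # 2
```

In practice the rows would come from `datasets.load_dataset("HAERAE-HUB/QARV-preview", split="train")`, which yields dictionaries with the same keys.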
jphme/glaive_de_raw_750 | ---
dataset_info:
features:
- name: chat_until_call
dtype: string
- name: chat_after_response
dtype: string
- name: sample_id
dtype: string
- name: conversations_pre
list:
- name: from
dtype: string
- name: value
dtype: string
- name: conversations_post
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 1328601
num_examples: 676
download_size: 634645
dataset_size: 1328601
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "glaive_de_raw_750"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AdaptLLM__finance-chat | ---
pretty_name: Evaluation run of AdaptLLM/finance-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AdaptLLM/finance-chat](https://huggingface.co/AdaptLLM/finance-chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AdaptLLM__finance-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:13:46.868987](https://huggingface.co/datasets/open-llm-leaderboard/details_AdaptLLM__finance-chat/blob/main/results_2024-01-05T00-13-46.868987.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5016996143762756,\n\
\ \"acc_stderr\": 0.03410321754614329,\n \"acc_norm\": 0.5066977999367995,\n\
\ \"acc_norm_stderr\": 0.03485965585821547,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.4454115477276852,\n\
\ \"mc2_stderr\": 0.014823664766519598\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49829351535836175,\n \"acc_stderr\": 0.014611305705056995,\n\
\ \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.014570144495075581\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5688109938259311,\n\
\ \"acc_stderr\": 0.004942302768002104,\n \"acc_norm\": 0.765982871937861,\n\
\ \"acc_norm_stderr\": 0.004225176623741732\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.040633027314866704,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.040633027314866704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556552,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556552\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.035240689515674495,\n\
\ \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.035240689515674495\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.0314102478056532,\n\
\ \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.0314102478056532\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\
\ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415182,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415182\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.03266478331527272,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.03266478331527272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.03182231867647554,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.03182231867647554\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7049808429118773,\n\
\ \"acc_stderr\": 0.016308363772932724,\n \"acc_norm\": 0.7049808429118773,\n\
\ \"acc_norm_stderr\": 0.016308363772932724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.02685425792825887,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.02685425792825887\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n\
\ \"acc_stderr\": 0.014078339253425812,\n \"acc_norm\": 0.23016759776536314,\n\
\ \"acc_norm_stderr\": 0.014078339253425812\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n\
\ \"acc_stderr\": 0.0281502322445356,\n \"acc_norm\": 0.5659163987138264,\n\
\ \"acc_norm_stderr\": 0.0281502322445356\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5246913580246914,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.5246913580246914,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3578878748370274,\n\
\ \"acc_stderr\": 0.012243563850490314,\n \"acc_norm\": 0.3578878748370274,\n\
\ \"acc_norm_stderr\": 0.012243563850490314\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872408,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872408\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824562,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824562\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.4454115477276852,\n\
\ \"mc2_stderr\": 0.014823664766519598\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431036\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18802122820318423,\n \
\ \"acc_stderr\": 0.010762621695354892\n }\n}\n```"
repo_url: https://huggingface.co/AdaptLLM/finance-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-13-46.868987.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-13-46.868987.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- '**/details_harness|winogrande|5_2024-01-05T00-13-46.868987.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-13-46.868987.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_13_46.868987
path:
- results_2024-01-05T00-13-46.868987.parquet
- split: latest
path:
- results_2024-01-05T00-13-46.868987.parquet
---
# Dataset Card for Evaluation run of AdaptLLM/finance-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AdaptLLM/finance-chat](https://huggingface.co/AdaptLLM/finance-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AdaptLLM__finance-chat",
"harness_winogrande_5",
	split="latest")
```
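The per-task configs above follow a regular naming pattern, so they can be built programmatically instead of copied by hand. The sketch below infers that pattern from this card's config list; it is an assumption, not an official API, and the `mmlu_config` helper is hypothetical.

```python
# Build the config name for any MMLU (hendrycksTest) subtask listed above.
# The naming scheme is inferred from this card's config list (an assumption,
# not an official API); `mmlu_config` is a hypothetical helper.

def mmlu_config(subtask: str, n_shot: int = 5) -> str:
    """Return the leaderboard config name for a hendrycksTest subtask."""
    return f"harness_hendrycksTest_{subtask}_{n_shot}"

print(mmlu_config("college_medicine"))
# harness_hendrycksTest_college_medicine_5

# The result can be passed to load_dataset in place of "harness_winogrande_5":
# data = load_dataset("open-llm-leaderboard/details_AdaptLLM__finance-chat",
#                     mmlu_config("college_medicine"), split="latest")
```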
## Latest results
These are the [latest results from run 2024-01-05T00:13:46.868987](https://huggingface.co/datasets/open-llm-leaderboard/details_AdaptLLM__finance-chat/blob/main/results_2024-01-05T00-13-46.868987.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in its timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5016996143762756,
"acc_stderr": 0.03410321754614329,
"acc_norm": 0.5066977999367995,
"acc_norm_stderr": 0.03485965585821547,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.4454115477276852,
"mc2_stderr": 0.014823664766519598
},
"harness|arc:challenge|25": {
"acc": 0.49829351535836175,
"acc_stderr": 0.014611305705056995,
"acc_norm": 0.537542662116041,
"acc_norm_stderr": 0.014570144495075581
},
"harness|hellaswag|10": {
"acc": 0.5688109938259311,
"acc_stderr": 0.004942302768002104,
"acc_norm": 0.765982871937861,
"acc_norm_stderr": 0.004225176623741732
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.040633027314866704,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.040633027314866704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556552,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556552
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.035240689515674495,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.035240689515674495
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.0314102478056532,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.0314102478056532
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415182,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415182
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.03266478331527272,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.03266478331527272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.03182231867647554,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.03182231867647554
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04712821257426769,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04712821257426769
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7049808429118773,
"acc_stderr": 0.016308363772932724,
"acc_norm": 0.7049808429118773,
"acc_norm_stderr": 0.016308363772932724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.02685425792825887,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.02685425792825887
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23016759776536314,
"acc_stderr": 0.014078339253425812,
"acc_norm": 0.23016759776536314,
"acc_norm_stderr": 0.014078339253425812
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5659163987138264,
"acc_stderr": 0.0281502322445356,
"acc_norm": 0.5659163987138264,
"acc_norm_stderr": 0.0281502322445356
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5246913580246914,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.5246913580246914,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3578878748370274,
"acc_stderr": 0.012243563850490314,
"acc_norm": 0.3578878748370274,
"acc_norm_stderr": 0.012243563850490314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872408,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872408
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824562,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824562
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.4454115477276852,
"mc2_stderr": 0.014823664766519598
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431036
},
"harness|gsm8k|5": {
"acc": 0.18802122820318423,
"acc_stderr": 0.010762621695354892
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sujayC66/157_mental_health_Counsel_Chat | ---
dataset_info:
features:
- name: questionTitle
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 161866
num_examples: 157
download_size: 97271
dataset_size: 161866
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Akajuuh/hebronjurebao | ---
license: openrail
---
|
Yago79/zilslo79 | ---
license: openrail
---
|
Marimba369/leonel-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4000
num_examples: 1000
download_size: 715
dataset_size: 4000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ChloeZeng/TrainingTest1 | ---
dataset_info:
features:
- name: Identifier
dtype: string
- name: Title
dtype: string
- name: Abstract
dtype: string
- name: Include/Exclude
dtype: string
- name: Criterion
dtype: string
- name: test
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1725420
num_examples: 341
download_size: 760619
dataset_size: 1725420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AiBototicus/Objects | ---
license: unknown
---
|
HIT-TMG/Hansel | ---
annotations_creators:
- crowdsourced
- found
language:
- zh
language_creators:
- found
- crowdsourced
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: hansel
pretty_name: Hansel
size_categories:
- 1M<n<10M
- 1K<n<10K
source_datasets:
- original
tags: []
task_categories:
- text-retrieval
task_ids:
- entity-linking-retrieval
dataset_info:
- config_name: wiki
features:
- name: id
dtype: string
- name: text
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: mention
dtype: string
- name: gold_id
dtype: string
splits:
- name: train
- name: validation
- config_name: hansel-few-shot
features:
- name: id
dtype: string
- name: text
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: mention
dtype: string
- name: gold_id
dtype: string
- name: source
dtype: string
- name: domain
dtype: string
splits:
- name: test
- config_name: hansel-zero-shot
features:
- name: id
dtype: string
- name: text
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: mention
dtype: string
- name: gold_id
dtype: string
- name: source
dtype: string
- name: domain
dtype: string
splits:
- name: test
---
# Dataset Card for "Hansel"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Splits](#data-splits)
- [Citation](#citation)
## Dataset Description
- **Homepage:** https://github.com/HITsz-TMG/Hansel
- **Paper:** https://arxiv.org/abs/2207.13005
Hansel is a high-quality human-annotated Chinese entity linking (EL) dataset, focusing on tail entities and emerging entities:
- The test set contains few-shot (FS) and zero-shot (ZS) slices, comprises 10K examples in total, and uses Wikidata as the corresponding knowledge base.
- The training and validation sets are derived from Wikipedia hyperlinks and are useful for large-scale pretraining of Chinese EL systems.
Please see our [WSDM 2023](https://www.wsdm-conference.org/2023/) paper [**"Hansel: A Chinese Few-Shot and Zero-Shot Entity Linking Benchmark"**](https://dl.acm.org/doi/10.1145/3539597.3570418) to learn more about our dataset.
For models in the paper and our processed knowledge base, please see our [Github repository](https://github.com/HITsz-TMG/Hansel).
## Dataset Structure
### Data Instances
{"id": "hansel-eval-zs-1463",
"text": "1905电影网讯 已经筹备了十余年的吉尔莫·德尔·托罗的《匹诺曹》,在上个月顺利被网飞公司买下,成为了流媒体巨头旗下的新片。近日,这部备受关注的影片确定了自己的档期:2021年。虽然具体时间未定,但影片却已经实实在在地向前迈出了一步。",
"start": 29,
"end": 32,
"mention": "匹诺曹",
"gold_id": "Q73895818",
"source": "https://www.1905.com/news/20181107/1325389.shtml",
"domain": "news"
}
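The `start` and `end` fields are character offsets into `text` (end-exclusive), so the annotated mention can be recovered by slicing. A minimal sketch using the record shown above; the `extract_mention` helper is illustrative and not part of the dataset tooling:

```python
# Sample record from the Hansel zero-shot test slice (copied from the
# instance shown above; "text" is the full document string).
example = {
    "id": "hansel-eval-zs-1463",
    "text": "1905电影网讯 已经筹备了十余年的吉尔莫·德尔·托罗的《匹诺曹》,在上个月顺利被网飞公司买下,成为了流媒体巨头旗下的新片。近日,这部备受关注的影片确定了自己的档期:2021年。虽然具体时间未定,但影片却已经实实在在地向前迈出了一步。",
    "start": 29,
    "end": 32,
    "mention": "匹诺曹",
    "gold_id": "Q73895818",
}

def extract_mention(record: dict) -> str:
    """Slice the mention surface form out of `text` using the
    end-exclusive character offsets `start` and `end`."""
    return record["text"][record["start"]:record["end"]]

# The sliced span matches the annotated mention string.
assert extract_mention(example) == example["mention"]  # "匹诺曹"
```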
### Data Splits
| | # Mentions | # Entities | Domain |
| ---- | ---- | ---- | ---- |
| Train | 9,879,813 | 541,058 | Wikipedia |
| Validation | 9,674 | 6,320 | Wikipedia |
| Hansel-FS | 5,260 | 2,720 | News, Social Media |
| Hansel-ZS | 4,715 | 4,046 | News, Social Media, E-books, etc. |
## Citation
If you find our dataset useful, please cite us.
```bibtex
@inproceedings{xu2022hansel,
author = {Xu, Zhenran and Shan, Zifei and Li, Yuxin and Hu, Baotian and Qin, Bing},
title = {Hansel: A Chinese Few-Shot and Zero-Shot Entity Linking Benchmark},
year = {2023},
publisher = {Association for Computing Machinery},
url = {https://doi.org/10.1145/3539597.3570418},
booktitle = {Proceedings of the Sixteenth ACM International Conference on Web Search and Data Mining},
pages = {832–840}
}
```
|
vitorsonic/emi2 | ---
license: openrail
---
|
open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2 | ---
pretty_name: Evaluation run of WizardLM/WizardLM-13B-V1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WizardLM/WizardLM-13B-V1.2](https://huggingface.co/WizardLM/WizardLM-13B-V1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T23:07:01.737511](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2/blob/main/results_2023-10-18T23-07-01.737511.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.09133808724832215,\n\
\ \"em_stderr\": 0.002950304012601038,\n \"f1\": 0.1617292365771806,\n\
\ \"f1_stderr\": 0.0032231699829319426,\n \"acc\": 0.4269860152120696,\n\
\ \"acc_stderr\": 0.011021928189223498\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.09133808724832215,\n \"em_stderr\": 0.002950304012601038,\n\
\ \"f1\": 0.1617292365771806,\n \"f1_stderr\": 0.0032231699829319426\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13495072024260804,\n \
\ \"acc_stderr\": 0.009411315282571171\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875825\n\
\ }\n}\n```"
repo_url: https://huggingface.co/WizardLM/WizardLM-13B-V1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|arc:challenge|25_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T23_07_01.737511
path:
- '**/details_harness|drop|3_2023-10-18T23-07-01.737511.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T23-07-01.737511.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T23_07_01.737511
path:
- '**/details_harness|gsm8k|5_2023-10-18T23-07-01.737511.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T23-07-01.737511.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hellaswag|10_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-26T14:20:40.943670.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-26T14:20:40.943670.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T23_07_01.737511
path:
- '**/details_harness|winogrande|5_2023-10-18T23-07-01.737511.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T23-07-01.737511.parquet'
- config_name: results
data_files:
- split: 2023_07_26T14_20_40.943670
path:
- results_2023-07-26T14:20:40.943670.parquet
- split: 2023_10_18T23_07_01.737511
path:
- results_2023-10-18T23-07-01.737511.parquet
- split: latest
path:
- results_2023-10-18T23-07-01.737511.parquet
---
# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardLM-13B-V1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-13B-V1.2](https://huggingface.co/WizardLM/WizardLM-13B-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-18T23:07:01.737511](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2/blob/main/results_2023-10-18T23-07-01.737511.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.09133808724832215,
"em_stderr": 0.002950304012601038,
"f1": 0.1617292365771806,
"f1_stderr": 0.0032231699829319426,
"acc": 0.4269860152120696,
"acc_stderr": 0.011021928189223498
},
"harness|drop|3": {
"em": 0.09133808724832215,
"em_stderr": 0.002950304012601038,
"f1": 0.1617292365771806,
"f1_stderr": 0.0032231699829319426
},
"harness|gsm8k|5": {
"acc": 0.13495072024260804,
"acc_stderr": 0.009411315282571171
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875825
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tolgadev/thycomments | ---
task_categories:
- text-classification
language:
- tr
- en
size_categories:
- 1K<n<10K
---
# AutoTrain Dataset for project: thycomments
## Dataset Description
This dataset has been automatically processed by HuggingFace AutoTrain for project thycomments.
### Languages
Turkish and English
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "@TK_TR 21 dk beklemem gerekti\u011fi s\u00f6yleniyor, m\u00fc\u015fteri temsilcisi ba\u011flanm\u0131yorum . \u0130nternet sitesinden de i\u015flem yap\u0131lam\u0131yor nas\u0131l \u00e7\u00f6z\u00fcm bulaca\u011f\u0131m ?",
"target": 0
},
{
"text": "@yhyustun Sevgili Yahya Bey Allah Rizasi icin bari sen bir aciklama yaparsan sevinirim.Konu su:Danimarkadan Turkiyeye ucuslar sistemde yok gorunuyor tum Mart ayi icin.1 Mart icin ucusum vardi fakat birkac gun once cagri merkeziyle gorustum ucuslar satisa kapanmis ancak bizim bir haberimiz",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['negative', 'neutral', 'positive'], id=None)"
}
```
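Because `target` is stored as a `ClassLabel` integer, decoding it back to a label name only requires the label order from the field definition above. A minimal sketch (the sample row is abbreviated from the instances shown earlier):

```python
# Decode the integer `target` column back to its class name.
# The label order follows the ClassLabel definition above.
LABELS = ["negative", "neutral", "positive"]

def decode_target(target: int) -> str:
    """Map a ClassLabel index to its human-readable name."""
    return LABELS[target]

sample = {"text": "@TK_TR 21 dk beklemem gerektigi soyleniyor...", "target": 0}
print(decode_target(sample["target"]))  # -> negative
```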
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2398 |
| valid | 601 | |
LRGB/voc_superpixels_edge_wt_coord_feat_30 | ---
task_categories:
- graph-ml
size_categories:
- 1M<n<10M
tags:
- lrgb
---
# `voc_superpixels_edge_wt_coord_feat_30`
### Dataset Summary
| Dataset | Domain | Task | Node Feat. (dim) | Edge Feat. (dim) | Perf. Metric |
|---|---|---|---|---|---|
| PascalVOC-SP| Computer Vision | Node Prediction | Pixel + Coord (14) | Edge Weight (1 or 2) | macro F1 |
| Dataset | # Graphs | # Nodes | μ Nodes | μ Deg. | # Edges | μ Edges | μ Short. Path | μ Diameter
|---|---:|---:|---:|:---:|---:|---:|---:|---:|
| PascalVOC-SP| 11,355 | 5,443,545 | 479.40 | 5.65 | 30,777,444 | 2,710.48 | 10.74±0.51 | 27.62±2.13 |
## Additional Information
### Dataset Curators
* Vijay Prakash Dwivedi ([vijaydwivedi75](https://github.com/vijaydwivedi75))
### Licensing Information
[Custom License](http://host.robots.ox.ac.uk/pascal/VOC/voc2011/index.html) for Pascal VOC 2011 (respecting Flickr terms of use)
### Citation Information
```
@article{dwivedi2022LRGB,
title={Long Range Graph Benchmark},
author={Dwivedi, Vijay Prakash and Rampášek, Ladislav and Galkin, Mikhail and Parviz, Ali and Wolf, Guy and Luu, Anh Tuan and Beaini, Dominique},
journal={arXiv:2206.08164},
year={2022}
}
``` |
Jiwonny29/project1 | ---
license: apache-2.0
task_categories:
- feature-extraction
language:
- en
tags:
- biology
pretty_name: test
size_categories:
- 100K<n<1M
dataset_info:
config_name: mydata
features:
- name: Year
dtype: int32
- name: LocationAbbr
dtype: string
- name: LocationDesc
dtype: string
- name: Latitude
dtype: float32
- name: Longitude
dtype: float32
- name: Disease_Type
dtype: int32
- name: Data_Value_Type
dtype: int32
- name: Data_Value
dtype: float32
- name: Break_Out_Category
dtype: string
- name: Break_Out_Details
dtype: string
- name: Break_Out_Type
dtype: int32
- name: Life_Expectancy
dtype: float32
---
### Dataset Description
This dataset encompasses mortality rates for cardiovascular (CVD) and heart diseases across the United States, covering both state-specific and national levels, from 2000 to 2020. The mortality rate is quantified as the number of deaths per 100,000 individuals annually in the US. The dataset is structured to classify mortality rates according to various demographic factors, including overall rates, gender (female, male), race (white, black, Hispanic, other), and age groups (18-24, 25-44, 45-65, 65+). Additionally, life expectancy data for each state is incorporated in the dataset. For ease of use, the data is aggregated at five-year intervals rather than annually.
### Dataset Sources
- CVD Mortality Data: Centers for Disease Control and Prevention(CDC) National Vital Statistics System
- https://data.cdc.gov/Heart-Disease-Stroke-Prevention/National-Vital-Statistics-System-NVSS-National-Car/kztq-p2jf/about_data
- Life Expectancy Data: Institute for Health Metrics and Evaluation
- https://ghdx.healthdata.org/record/ihme-data/united-states-life-expectancy-by-county-race-ethnicity-2000-2019
## Uses
This dataset serves as a valuable resource for researchers and individuals interested in examining and identifying patterns related to cardiovascular diseases in the United States. It can be utilized to forecast future fatalities caused by heart diseases by leveraging similar features present in the dataset. Additionally, the dataset enables users to gain insights into identifying states that require assistance and support in reducing mortality rates. Below are example use cases and corresponding codes:
- Analyzing the comprehensive picture of mortality and conducting time series analysis on mortality rates
- https://colab.research.google.com/drive/1ulygrSt9jt3x_4WIGD6QdK0TcGZlpuYF
- Building regression models
- https://colab.research.google.com/drive/1DhIni026qz5qqjfWwKXnqoQXDy-HzroC
- Developing a web application for users to quickly understand and compare mortality rates among states, along with relevant information like state population
- https://github.com/jiwonny29/Exploring_US_Cardiovascular_Mortality_Trends_via_Streamlit
## Dataset Structure
This dataset contains the following columns:
- Year (int32): This column contains the year of the data record, with values ranging from 2000 to 2020
- LocationAbbr (String): Abbreviation representing the location, typically a state
- LocationDesc (String): The full name or detailed description of the location
- Latitude (float32) : Geographic coordinate that specifies the north-south position of a point on the Earth's surface
- Longitude (float32) : Geographic coordinate that specifies the east-west position of a point on the Earth's surface
- Geolocation (Tuple): A pair of latitude and longitude coordinates, formatted as (latitude, longitude), providing the geolocation or geocode of the location
- Disease_Type (int32): A key column in the dataset, representing eight unique types of cardiovascular diseases, numbered from 0 to 7. The values correspond to the following diseases:
- 0: Major Cardiovascular Disease
- 1: Diseases of the Heart (Heart Disease)
- 2: Acute Myocardial Infarction (Heart Attack)
- 3: Coronary Heart Disease
- 4: Heart Failure
- 5: Cerebrovascular Disease (Stroke)
- 6: Ischemic Stroke
- 7: Hemorrhagic Stroke
- Data_Value_Type (int32): Represents the type of data value. "Age-Standardized" is represented by 1, and "Crude" is represented by 0, indicating the measurement methods for the data value columns
- Data_Value (float32): This column represents the number of deaths per 100,000 population
- Break_Out_Category (string): This category is used for breaking down the data and includes four unique values: "Overall," "Gender," "Age," and "Race."
- Break_Out_Details (string): Specific subcategories within the Break_Out_Category. This column includes values like "Overall," six age categories (e.g., "18-24," "25-44"), two gender categories (e.g., "Female," "Male"), and four race categories (e.g., "Hispanic," "Non-Hispanic Black," "Non-Hispanic White," "Other").
- Break_Out_Type (int32): A numerical transformation of the Break_Out_Details column. In this system, "Overall" is represented as 0, "Male" and "Female" as 1 and 2, respectively; age groups "18-24," "25-44," "45-64," "65+" as 1, 2, 3, 4, respectively; and racial categories "Non-Hispanic White," "Non-Hispanic Black," "Hispanic," "Other" as 1, 2, 3, 4, respectively.
- Life_Expectancy (float32): Represents the life expectancy for the applicable year and state
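The integer codes above can be mapped back to readable labels with small lookup tables. The following sketch is built directly from the lists above; the `describe` helper and the sample row are illustrative, not part of the dataset:

```python
# Human-readable names for the integer-coded columns described above.
DISEASE_TYPES = {
    0: "Major Cardiovascular Disease",
    1: "Diseases of the Heart (Heart Disease)",
    2: "Acute Myocardial Infarction (Heart Attack)",
    3: "Coronary Heart Disease",
    4: "Heart Failure",
    5: "Cerebrovascular Disease (Stroke)",
    6: "Ischemic Stroke",
    7: "Hemorrhagic Stroke",
}
DATA_VALUE_TYPES = {1: "Age-Standardized", 0: "Crude"}

def describe(row: dict) -> str:
    """Summarize one record using the code tables above."""
    return (f"{row['LocationAbbr']} {row['Year']}: "
            f"{DISEASE_TYPES[row['Disease_Type']]} "
            f"({DATA_VALUE_TYPES[row['Data_Value_Type']]}) = "
            f"{row['Data_Value']} deaths per 100,000")

# Illustrative row, not an actual record from the dataset.
row = {"Year": 2020, "LocationAbbr": "NC", "Disease_Type": 4,
       "Data_Value_Type": 0, "Data_Value": 83.2}
print(describe(row))
```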
|
jondurbin/airoboros-gpt4-2.0 | ---
license: other
---
## Overview
This is a brand new dataset, with nothing copied from the 1.* series of airoboros, using only the June version of gpt-4.
I used the latest overhaul of the airoboros python tool to generate the data, which has several "instructors", where an instructor is a specific prompt/response generator.
The instructors include:
- agent/function style prompts, which generate a function name and args based on the provided input and available functions in either JSON or YAML format
- model/scenario/character cards, to help build random descriptive cards based on a template
- coding and scripting
- contextual q&a with the specific context obedient formatting
- chain-of-thought, i.e. for a given question, generate ~3 possible solutions, rank them, select the best
- experience, e.g. guided meditations or describing a walk through a forest
- general - completely random tasks not specifically targeting any type of task, using a random list of topics
- jokes - still horrible, but at least there are some now
- orca, i.e. "Solve [problem], provide step-by-step reasoning."
- execution planning, specifically the reWOO style, where you describe a list of available functions and it will generate a plan to make use of them
- riddles - still not great either, but present
- roleplay
- songs
- wordgames, e.g. give me a list of 28 words that start with 'cr'
- creative writing
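As a rough illustration of the agent/function style (the schema and function names here are hypothetical; the actual samples in the dataset may be formatted differently), a prompt would list the available functions and the expected response would select one with arguments:

```python
import json

# Hypothetical agent/function-style pair: the prompt describes the available
# functions, and the expected response names one function plus its arguments.
# These function names and this schema are illustrative, not the dataset's own.
available_functions = {
    "search_weather": {"args": ["location", "unit"]},
    "send_email": {"args": ["to", "subject", "body"]},
}
response = {
    "function": "search_weather",
    "args": {"location": "Paris", "unit": "celsius"},
}

# A response is only valid if it names one of the declared functions.
assert response["function"] in available_functions
print(json.dumps(response, indent=2))
```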
**Is it better than 1.4?**
Not necessarily. It has some extra functionality that didn't exist before, but if you want to be sure you don't lose much, check out m2.0, which is a merge of 1.4.1 and 2.0:
https://huggingface.co/datasets/jondurbin/airoboros-gpt4-m2.0
The main point here was to test the June version of gpt-4 against the March version (and add new prompt types).
### Category breakdown

### Configuration for airoboros
https://gist.github.com/jondurbin/65df002c16560899e05365ca6cbd43e3
### Licence and usage restrictions
The data was generated by gpt-4 via OpenAI API calls.
The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI
- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise unallowably licensed material in the first place
- other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2
I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.
Your best bet is probably to avoid using this commercially due to the OpenAI API usage.
Either way, by using this model, you agree to completely indemnify me from any and all license related issues.
Attribution would be nice if you use some or all of the data. |
jetaudio/zh2en_names | ---
dataset_info:
features:
- name: trg
dtype: string
- name: scr
dtype: string
splits:
- name: train
num_bytes: 40707111.848671876
num_examples: 1023730
download_size: 29232333
dataset_size: 40707111.848671876
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seablue/DiDi_GAIA_dataset | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: most_similar_instructions
struct:
- name: 下面我给出了一段代码,请你帮我给下面代码加上注释。
dtype: float64
- name: 下面是一段代码,请你添加注释,以便于其他人更好地了解代码。
dtype: float64
- name: 下面是一段可以自动化为你的代码添加注释的工具,请你根据这个工具的描述,使用它为你的代码添加注释。
dtype: float64
- name: 下面是一段需要加注释的代码,请为每一行添加注释并描述其作用。
dtype: float64
- name: 下面的代码令人困惑,请为每行添加注释以解释其含义。
dtype: float64
- name: 下面的代码可能会让其他人感到困惑,请为每一行添加注释以便于其他人理解。
dtype: float64
- name: 下面的代码需要添加注释以解释代码实现的逻辑,请您为其添加注释。
dtype: float64
- name: 下面的代码需要添加注释来解释代码的目的,请你给出相应的注释。
dtype: float64
- name: 下面这段代码需要加入一些注释以便后续使用,请你帮忙补充一下。
dtype: float64
- name: 下面这段代码需要添加注释以解释其中的细节和处理过程,请帮我添加注释。
dtype: float64
- name: 下面这段代码需要添加注释以解释其在整个项目中的作用,请您为其添加注释。
dtype: float64
- name: 下面这段代码需要补充注释来解释变量和函数的用途和功能,请你为其添加注释。
dtype: float64
- name: 你能帮我解释一下以下代码的作用吗?
dtype: float64
- name: 可以为下面的代码添加注释,以便于其他人更好地理解吗?
dtype: float64
- name: 在下面的代码中添加注释,以便阅读代码时更加容易理解和使用。
dtype: float64
- name: 在下面这段代码中添加注释,使得代码更加易读、易用。
dtype: float64
- name: 帮我增加一些注释,让下面这个代码片段更好理解。
dtype: float64
- name: 我需要你帮我写一个自动生成注释的程序。请写一段代码,使其能够根据每行代码的功能生成注释。
dtype: float64
- name: 根据下面的代码实现,请为其添加注释以便更好地了解其实现思路。
dtype: float64
- name: 根据下面的代码逻辑,请为其添加注释,以方便更好地理解代码。
dtype: float64
- name: 根据下面的代码,为每一行添加注释以解释其含义。
dtype: float64
- name: 根据下面的代码,请为每行添加注释来描述其作用。
dtype: float64
- name: 给下面这段代码添加注释,让其他人了解代码的实现细节和使用方法。
dtype: float64
- name: 能不能编写一段代码来自动生成注释呢?
dtype: float64
- name: 能否为下面的代码添加注释,以便于我和其他人更好地理解和使用?
dtype: float64
- name: 能否为下面的代码添加注释,使得其他人更容易了解代码和使用方法?
dtype: float64
- name: 能否为下面的代码添加注释,使得我们更容易理解代码的逻辑和实现方法?
dtype: float64
- name: 能否为下面的代码添加注释,描述代码的主要作用和输入输出。
dtype: float64
- name: 能否为下面的函数添加注释,以方便阅读和理解代码?
dtype: float64
- name: 能否为下面的类添加注释,以便更好地理解其属性和方法?
dtype: float64
- name: 能否为下面这段代码添加注释,让其他人更好地了解代码的功能和使用方法?
dtype: float64
- name: 能否为这个开源项目中的代码添加注释,以便新的开发者更快地了解其功能?
dtype: float64
- name: 能否为这段代码添加注释,解释各个变量和函数的作用?
dtype: float64
- name: 能否为这段代码编写注释,以便于初学者理解代码逻辑和实现方式?
dtype: float64
- name: 能否为这段代码自动生成注释,描述代码的主要功能和用途?
dtype: float64
- name: 能否帮我添加下面这段代码的注释,使得代码更加容易阅读和理解。
dtype: float64
- name: 能否编写代码,自动识别出变量和函数的作用,然后为它们添加注释?
dtype: float64
- name: 能否请你为下面这段代码增加一些注释,使得别人也能够看懂?
dtype: float64
- name: 能否请你在下面的代码中为每一行添加一些注释?
dtype: float64
- name: 能帮我给下面代码加上注释吗?
dtype: float64
- name: 请为下面的代码增加注释,以便在以后代码需要维护时更好地理解和更改代码。
dtype: float64
- name: 请为下面的代码添加注释,以便于后续的调试和维护。
dtype: float64
- name: 请为下面的代码添加注释,以便于我们更好地理解代码的实现和功能。
dtype: float64
- name: 请为下面的代码添加注释,以解释每一行代码的用途。
dtype: float64
- name: 请为下面的代码编写简要的注释,方便阅读和理解。
dtype: float64
- name: 请为下面的函数添加注释,描述函数的输入、输出和用途。
dtype: float64
- name: 请为下面这段代码添加注释来解释各个变量和函数的作用。
dtype: float64
- name: 请为下面这段代码添加注释,注释中需要说明代码的执行步骤以及相关函数的作用。
dtype: float64
- name: 请为下面这段代码添加注释,注释中需要说明每个函数的功能和用途。
dtype: float64
- name: 请为下面这段代码添加注释,注释中需要说明该代码的入参和出参以及相关算法的实现。
dtype: float64
- name: 请为以下代码添加注释,以便更好地理解代码的实现逻辑。
dtype: float64
- name: 请为以下代码添加注释,描述代码的主要实现思路。
dtype: float64
- name: 请你为下面的代码添加注释,解释代码的主要思路和逻辑。
dtype: float64
- name: 请你写一段代码,并为你所写的代码加上适当的注释。
dtype: float64
- name: 请你在下面的代码中找出难懂的地方并加上相应的注释。
dtype: float64
- name: 请你帮忙把下面的代码加上适当的注释,以便于其他人更好地了解代码。
dtype: float64
- name: 请你根据下面的代码,给出一份详细的代码注释,让其他人更好地理解代码。
dtype: float64
- name: 请你给下面的代码添加注释,以便于我们更好地理解代码的功能和实现方法。
dtype: float64
- name: 请写一个生成类方法注释的函数。
dtype: float64
- name: 请写一个生成自然语言描述的代码注释的代码。
dtype: float64
- name: 请写一段代码,使其能够为一个包含多个函数的脚本文件自动添加注释。
dtype: float64
- name: 请写一段代码,使其能够为一个带有参数的函数生成注释。
dtype: float64
- name: 请写一段代码,使其能够根据代码段的逻辑结构生成注释。
dtype: float64
- name: 请写一段代码,使其能够根据变量和函数的命名规则来自动生成注释。
dtype: float64
- name: 请写一段代码,使其能够自动为一个特定函数生成相应的注释。
dtype: float64
- name: 请写一段代码,根据函数的输入和输出来自动生成注释。
dtype: float64
- name: 请写一段代码,生成注释,描述一个函数的作用和输入输出格式。
dtype: float64
- name: 请写一段代码,生成注释,描述一个函数的输入输出和用法实例。
dtype: float64
- name: 请写一段代码,生成注释,描述一个程序中的类的功能和属性。
dtype: float64
- name: 请写一段代码,自动为一个Python模块里的所有函数和类生成注释。
dtype: float64
- name: 请对下面的代码添加适当的注释,以便其他人更好地理解。
dtype: float64
- name: 请帮我为下面的代码片段添加注释,方便其他人更好地理解代码。
dtype: float64
- name: 请帮我为这个循环添加注释,以便更好地理解它。
dtype: float64
- name: 请帮我给下面这段代码添加注释,以便于其他人了解代码的使用方法和功能。
dtype: float64
- name: 请您给下面的代码加上注释,以便于后续代码维护和开发。
dtype: float64
- name: 请根据下面的代码描述每一行的功能,然后加上注释。
dtype: float64
- name: 请根据下面的代码,为代码添加注释以便于其他人理解。
dtype: float64
- name: 请根据以下代码的功能,为其加上注释。
dtype: float64
- name: 请根据你的理解为下面的代码片段增加注释。
dtype: float64
- name: 请生成一组注释,描述下面这段代码的主要功能。
dtype: float64
- name: 请给下面的代码加上注释。
dtype: float64
- name: 请给下面的代码添加注释来解释代码的执行顺序。
dtype: float64
- name: 请给下面的代码添加注释,以便于其他人更好地了解代码的使用方法和注意事项。
dtype: float64
- name: 请给下面这段代码添加注释,以便于我们更好地了解代码的功能和用法。
dtype: float64
- name: 请问你能为下面的代码添加注释吗?
dtype: float64
- name: 请阅读下面的代码并为其添加注释以指明该段代码的功能。
dtype: float64
- name: 请阅读下面这段代码,并为它添加必要的注释,以便理解。
dtype: float64
- name: 请阅读以下代码,给代码加上相应的注释。
dtype: float64
- name: 这是一段优化后的代码,请你为每一行代码添加注释。
dtype: float64
- name: 这段代码可能有一些复杂,你能不能帮我添加一些注释来让它更加易读?
dtype: float64
- name: 这段代码有些复杂,请你给每个函数和循环写上注释,以便于读者理解代码逻辑。
dtype: float64
- name: 这里是一段写好的代码,你能帮我为它添加注释吗?
dtype: float64
- name: 这里是一段没有注释的代码,你能帮我添加一些注释,使其更易读吗?
dtype: float64
- name: 麻烦你为这些变量和函数写一些注释,以便于别人理解代码。
dtype: float64
- name: avg_similarity_score
dtype: float64
splits:
- name: train
num_bytes: 62883
num_examples: 73
download_size: 128943
dataset_size: 62883
---
|
pccl-org/formal-logic-simple-order-new-objects-paired-softer-2000 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: paired_example
sequence:
sequence: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 506158166
num_examples: 1997003
download_size: 162485575
dataset_size: 506158166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PhilipMay/UltraChat-200k-ShareGPT-clean | ---
license: mit
language:
- en
size_categories:
- 100K<n<1M
---
# UltraChat-200k ShareGPT Clean
This dataset is cleaned and created with [`01_convert_ultrachat_200k_train_sft.ipynb`](https://github.com/PhilipMay/llm-data/blob/main/01_convert_ultrachat_200k_train_sft.ipynb) and
[`02_convert_ultrachat_200k_test_sft.ipynb`](https://github.com/PhilipMay/llm-data/blob/main/02_convert_ultrachat_200k_test_sft.ipynb)
based on [HuggingFaceH4/ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k)
(train_sft and test_sft).
Main changes:
- convert to conversations format which is supported by [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl) - see [ShareGPT](https://github.com/OpenAccess-AI-Collective/axolotl?tab=readme-ov-file#conversation)
- clean invisible characters and strip - see
[`mltb2.text.clean_all_invisible_chars_and_strip()`](https://telekom.github.io/mltb2/api-reference/text.html#mltb2.text.clean_all_invisible_chars_and_strip)
- remove rows with empty text
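The invisible-character cleaning step can be approximated in plain Python — a sketch only, not the actual `mltb2.text.clean_all_invisible_chars_and_strip()` implementation:

```python
import unicodedata

def clean_invisible_and_strip(text: str) -> str:
    """Drop Unicode format/control characters (categories Cf and Cc, except
    newlines and tabs) and strip surrounding whitespace. This is a rough
    approximation of mltb2's cleaning, not its actual implementation."""
    kept = [ch for ch in text
            if unicodedata.category(ch) not in ("Cf", "Cc") or ch in "\n\t"]
    return "".join(kept).strip()

rows = ["  hello\u200bworld  ", "\u200b \u200b", "ok"]
cleaned = [clean_invisible_and_strip(r) for r in rows]
# Remove rows that end up empty after cleaning.
cleaned = [r for r in cleaned if r]
print(cleaned)  # -> ['helloworld', 'ok']
```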
## Licensing
Copyright (c) 2024 [Philip May](https://philipmay.org/)\
Copyright (c) The Hugging Face team\
Copyright (c) THUNLP
Licensed under the **MIT License** (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License by reviewing the file
[LICENSE](https://huggingface.co/datasets/PhilipMay/UltraChat-200k-ShareGPT/blob/main/LICENSE) in the repository.
|
limenous/fashion_image_caption-100-v212 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v212"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vojtab42/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/theidolmastercinderellagirlsu149 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of The Idolmaster - Cinderella Girls - U149
This is the image base of bangumi The Idolmaster - Cinderella Girls - U149, we detected 43 characters, 4244 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 389 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 42 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 235 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 37 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 17 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 243 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 15 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 325 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 15 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 18 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 21 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 124 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 78 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 45 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 690 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 12 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 11 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 17 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 23 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 13 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 30 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 10 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 22 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 271 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 230 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 22 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 9 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 24 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 257 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 268 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 24 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 12 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 52 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 309 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 90 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 12 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 8 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 8 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 8 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 6 | [Download](39/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 40 | 5 | [Download](40/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 41 | 7 | [Download](41/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 190 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
GiorgiSekhniashvili/common_voice_13_0 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: float64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6134418816
num_examples: 6379
- name: validation
num_bytes: 4254421264
num_examples: 4424
download_size: 2008703225
dataset_size: 10388840080
---
# Dataset Card for "common_voice_13_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/shp-generated_flan_t5_large_flan_t5_large_zeroshot | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: zeroshot_helpfulness
dtype: float64
- name: zeroshot_specificity
dtype: float64
- name: zeroshot_intent
dtype: float64
- name: zeroshot_factuality
dtype: float64
- name: zeroshot_easy-to-understand
dtype: float64
- name: zeroshot_relevance
dtype: float64
- name: zeroshot_readability
dtype: float64
- name: zeroshot_enough-detail
dtype: float64
- name: 'zeroshot_biased:'
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences
dtype: float64
- name: zeroshot_repetetive
dtype: float64
- name: zeroshot_fail-to-consider-context
dtype: float64
- name: zeroshot_too-long
dtype: float64
splits:
- name: train
num_bytes: 29493865
num_examples: 25600
download_size: 1905432
dataset_size: 29493865
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shp-generated_flan_t5_large_flan_t5_large_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
katielink/genomic-benchmarks | ---
tags:
- biology
- genomics
- medical
configs:
- config_name: demo_coding_vs_intergenomic_seqs
data_files:
- split: train
path: demo_coding_vs_intergenomic_seqs/train/*.csv.gz
- split: test
path: demo_coding_vs_intergenomic_seqs/test/*.csv.gz
- config_name: demo_human_or_worm
data_files:
- split: train
path: demo_human_or_worm/train/*.csv.gz
- split: test
path: demo_human_or_worm/test/*.csv.gz
- config_name: drosphilia_enhancers_stark
data_files:
- split: train
path: drosphilia_enhancers_stark/train/*.csv.gz
- split: test
path: drosphilia_enhancers_stark/test/*.csv.gz
- config_name: dummy_mouse_enhancers_ensembl
data_files:
- split: train
path: dummy_mouse_enhancers_ensembl/train/*.csv.gz
- split: test
path: dummy_mouse_enhancers_ensembl/test/*.csv.gz
- config_name: human_enhancers_cohn
data_files:
- split: train
path: human_enhancers_cohn/train/*.csv.gz
- split: test
path: human_enhancers_cohn/test/*.csv.gz
- config_name: human_enhancers_ensembl
data_files:
- split: train
path: human_enhancers_ensembl/train/*.csv.gz
- split: test
path: human_enhancers_ensembl/test/*.csv.gz
- config_name: human_ensembl_regulatory
data_files:
- split: train
path: human_ensembl_regulatory/train/*.csv.gz
- split: test
path: human_ensembl_regulatory/test/*.csv.gz
- config_name: human_nontata_promoters
data_files:
- split: train
path: human_nontata_promoters/train/*.csv.gz
- split: test
path: human_nontata_promoters/test/*.csv.gz
- config_name: human_ocr_ensembl
data_files:
- split: train
path: human_ocr_ensembl/train/*.csv.gz
- split: test
path: human_ocr_ensembl/test/*.csv.gz
license: apache-2.0
---
# Genomic Benchmark
In this repository, we collect benchmarks for classification of genomic sequences. It is shipped as a Python package, together with helper functions for downloading & manipulating datasets and training NN models.
## Citing Genomic Benchmarks
If you use Genomic Benchmarks in your research, please cite it as follows.
### Text
GRESOVA, Katarina, et al. Genomic Benchmarks: A Collection of Datasets for Genomic Sequence Classification. bioRxiv, 2022.
### BibTeX
```bib
@article{gresova2022genomic,
title={Genomic Benchmarks: A Collection of Datasets for Genomic Sequence Classification},
author={Gresova, Katarina and Martinek, Vlastimil and Cechak, David and Simecek, Petr and Alexiou, Panagiotis},
journal={bioRxiv},
year={2022},
publisher={Cold Spring Harbor Laboratory},
url={https://www.biorxiv.org/content/10.1101/2022.06.08.495248}
}
```
From the [github repo](https://github.com/ML-Bioinfo-CEITEC/genomic_benchmarks/tree/main):
# Datasets
Each folder contains either one benchmark or a set of benchmarks. See [docs/](../docs/) for code used to create these benchmarks.
### Naming conventions
* *dummy_...*: small datasets, used for testing purposes
* *demo_...*: medium-sized datasets, not necessarily biologically relevant or fully reproducible, used in demos
### Versioning
We recommend checking the version number when working with the dataset (i.e. not using the default `None`). The version should be set to 0 when the dataset is proposed; after initial curation it should be changed to 1 and then increased after every modification.
### Data format
Each benchmark's main folder should contain a `metadata.yaml` file with the specification in YAML format, namely
* **the version** of the benchmark (0 = in development)
* **the classes** of genomic sequences, for each class we further need to specify
- *url* with the reference
- *type* of the reference (currently, only fa.gz implemented)
  - *extra_processing*, a parameter helping to overcome some known issues with identifier matching
The main folder should also contain two folders, `train` and `test`. Both those folders should contain gzipped CSV files, one for each class (named `class_name.csv.gz`).
The format of the gzipped CSV files closely resembles the BED format; the column names must be the following:
* **id**: id of a sequence
* **region**: chromosome/transcript/... to be matched with the reference
* **start**, **end**: genomic interval specification (0-based, i.e. same as in Python)
* **strand**: either '+' or '-'
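As a minimal sketch of the format described above (the file name is illustrative), one of these gzipped per-class CSV files could be read with only the Python standard library:

```python
import csv
import gzip

def read_class_csv(path):
    """Read a gzipped per-class CSV file (BED-like columns) into a list of dicts."""
    rows = []
    with gzip.open(path, "rt", newline="") as f:
        for row in csv.DictReader(f):
            # start/end are 0-based genomic coordinates, i.e. same as Python slicing
            row["start"], row["end"] = int(row["start"]), int(row["end"])
            rows.append(row)
    return rows
```

The resulting `(region, start, end, strand)` tuples can then be matched against the reference FASTA named in `metadata.yaml` to recover the actual sequences.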
### To contribute a new dataset
Create a new branch. Add the new subfolders to `datasets` and `docs`. The subfolder of `docs` should contain a description of the dataset in `README.md`. If the dataset comes with the paper, link the paper. If the dataset is not taken from the paper, make sure you have described and understand the biological process behind it.
If you have access to `cloud_cache` folder on GDrive, upload your file there and update `CLOUD_CACHE` in [cloud_caching.py](https://github.com/ML-Bioinfo-CEITEC/genomic_benchmarks/blob/main/src/genomic_benchmarks/loc2seq/cloud_caching.py).
### To review a new dataset
Make sure you can run and reproduce the code. Check that you can download the actual sequences and/or create a data loader. Make sure you understand what is behind the data (either from the paper or the description), and ask for clarification if needed. |
Multimodal-Fatima/OxfordPets_test_facebook_opt_350m_Visclues_ns_3669_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 122803649.375
num_examples: 3669
- name: fewshot_3_bs_16
num_bytes: 125490550.375
num_examples: 3669
download_size: 242181633
dataset_size: 248294199.75
---
# Dataset Card for "OxfordPets_test_facebook_opt_350m_Visclues_ns_3669_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CreativeLang/TwistList | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 217777
num_examples: 1913
- name: test
num_bytes: 10963
num_examples: 108
- name: val
num_bytes: 13036
num_examples: 107
download_size: 173838
dataset_size: 241776
---
# TwistList
- paper: [TwistList: Resources and Baselines for Tongue Twister Generation](https://aclanthology.org/2023.acl-short.51/) at ACL 2023
Metadata in Creative Language Toolkit ([CLTK](https://github.com/liyucheng09/cltk))
- CL Type: Tongue Twister
- Task Type: Generation
- Size: 2k
- Created time: 2023 |
tonyshining/vlsp20_2proceed | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 11815806943.0
num_examples: 10000
download_size: 4769846911
dataset_size: 11815806943.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_3_tp_0.5 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43651714
num_examples: 18928
- name: epoch_1
num_bytes: 44108986
num_examples: 18928
- name: epoch_2
num_bytes: 44190132
num_examples: 18928
- name: epoch_3
num_bytes: 44243057
num_examples: 18928
- name: epoch_4
num_bytes: 44264445
num_examples: 18928
- name: epoch_5
num_bytes: 44268589
num_examples: 18928
- name: epoch_6
num_bytes: 44270424
num_examples: 18928
- name: epoch_7
num_bytes: 44266996
num_examples: 18928
- name: epoch_8
num_bytes: 44262845
num_examples: 18928
- name: epoch_9
num_bytes: 44259109
num_examples: 18928
- name: epoch_10
num_bytes: 44256111
num_examples: 18928
- name: epoch_11
num_bytes: 44255804
num_examples: 18928
- name: epoch_12
num_bytes: 44256675
num_examples: 18928
- name: epoch_13
num_bytes: 44255020
num_examples: 18928
- name: epoch_14
num_bytes: 44258152
num_examples: 18928
- name: epoch_15
num_bytes: 44258656
num_examples: 18928
- name: epoch_16
num_bytes: 44258514
num_examples: 18928
- name: epoch_17
num_bytes: 44257122
num_examples: 18928
- name: epoch_18
num_bytes: 44257308
num_examples: 18928
- name: epoch_19
num_bytes: 44256202
num_examples: 18928
- name: epoch_20
num_bytes: 44255073
num_examples: 18928
- name: epoch_21
num_bytes: 44257725
num_examples: 18928
- name: epoch_22
num_bytes: 44258057
num_examples: 18928
- name: epoch_23
num_bytes: 44258716
num_examples: 18928
- name: epoch_24
num_bytes: 44256414
num_examples: 18928
- name: epoch_25
num_bytes: 44258081
num_examples: 18928
- name: epoch_26
num_bytes: 44257307
num_examples: 18928
- name: epoch_27
num_bytes: 44256935
num_examples: 18928
- name: epoch_28
num_bytes: 44258974
num_examples: 18928
- name: epoch_29
num_bytes: 44257226
num_examples: 18928
download_size: 684998007
dataset_size: 1326930369
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
sarrouche/tat-qa-numeric | ---
license: openrail
---
|
aditya11997/dog_heart_xray_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 47329808.0
num_examples: 208
download_size: 47320223
dataset_size: 47329808.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tad1234/mini-platypus-tb | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245924
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pnadel/iliad_odyssey_aligned_prompts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentid
dtype: string
- name: cit
dtype: string
- name: Eng
dtype: string
- name: Gk
dtype: string
- name: Lems
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 40459548.37679168
num_examples: 36669
- name: test
num_bytes: 10115714.623208325
num_examples: 9168
download_size: 24752837
dataset_size: 50575263.0
---
# Dataset Card for "iliad_odyssey_aligned_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naorm/caption-eval-blip2 | ---
dataset_info:
features:
- name: model_name
dtype: string
- name: metric_name
dtype: string
- name: value
dtype: float64
splits:
- name: train
num_bytes: 465
num_examples: 14
download_size: 1951
dataset_size: 465
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Akshayxx/CoraDatasetV2 | ---
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1328483
num_examples: 1768
- name: test
num_bytes: 173380
num_examples: 222
- name: validation
num_bytes: 164474
num_examples: 221
download_size: 887011
dataset_size: 1666337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
cloudmark/lccp | ---
license: openrail
---
|
mask-distilled-one-sec-cv12/chunk_248 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 918749560
num_examples: 180430
download_size: 934521804
dataset_size: 918749560
---
# Dataset Card for "chunk_248"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rrojo/example001 | ---
language:
- es
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lc-col/bigearthnet | ---
task_categories:
- image-classification
pretty_name: BigEarthNet
size_categories:
- 100K<n<1M
---
# BigEarthNet - HDF5 version
This repository contains an export of the existing BigEarthNet dataset in HDF5 format. All Sentinel-2 acquisitions are exported according to TorchGeo's dataset (120x120 pixels resolution).
Sentinel-1 is not included in this repository at the moment.
CSV files contain for each satellite acquisition the corresponding HDF5 file and the index.
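As a rough sketch of how that lookup could be parsed (the column names `patch`, `h5_file`, and `index` are assumptions for illustration, not the dataset's actual headers), using only the standard library:

```python
import csv

def build_index(csv_path):
    """Map each acquisition name to the (HDF5 file, row index) that stores it.

    Column names used here (patch, h5_file, index) are illustrative assumptions;
    adjust them to the actual headers of the CSV files shipped with the dataset.
    """
    mapping = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            mapping[row["patch"]] = (row["h5_file"], int(row["index"]))
    return mapping
```

A dataset class would then open the named HDF5 file (e.g. with `h5py`) and read the patch at the stored index.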
A PyTorch dataset class which can be used to iterate over this dataset can be found [here](https://github.com/lccol/bigearthnet-conversion), as well as the script used to convert it into HDF5 format. |
aaditya/alpaca_subset_1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 444296.5943617553
num_examples: 500
download_size: 228351
dataset_size: 444296.5943617553
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
islamrokon/Test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 17012.625
num_examples: 14
- name: test
num_bytes: 2430.375
num_examples: 2
download_size: 17101
dataset_size: 19443.0
---
# Dataset Card for "Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tonic/medquad | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7324638
num_examples: 15549
download_size: 2208412
dataset_size: 7324638
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "medquad"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DheerajNalapat/code_correction_test | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 6024669
num_examples: 7770
download_size: 1444414
dataset_size: 6024669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
semiotic/sql_templates | ---
dataset_info:
features:
- name: datasource_id
dtype: string
- name: datasource_type
dtype: string
- name: query_template_id
dtype: int64
- name: database_ids
struct:
- name: aan_1
dtype: int64
- name: activity_1
dtype: int64
- name: address_1
dtype: int64
- name: advertising_agencies
dtype: int64
- name: aircraft
dtype: int64
- name: allergy_1
dtype: int64
- name: apartment_rentals
dtype: int64
- name: architecture
dtype: int64
- name: art_1
dtype: int64
- name: assets_maintenance
dtype: int64
- name: bakery_1
dtype: int64
- name: baseball_1
dtype: int64
- name: battle_death
dtype: int64
- name: bbc_channels
dtype: int64
- name: behavior_monitoring
dtype: int64
- name: bike_1
dtype: int64
- name: bike_racing
dtype: int64
- name: boat_1
dtype: int64
- name: body_builder
dtype: int64
- name: book_1
dtype: int64
- name: book_2
dtype: int64
- name: book_press
dtype: int64
- name: book_review
dtype: int64
- name: browser_web
dtype: int64
- name: candidate_poll
dtype: int64
- name: car_1
dtype: int64
- name: car_racing
dtype: int64
- name: car_road_race
dtype: int64
- name: chinook_1
dtype: int64
- name: cinema
dtype: int64
- name: city_record
dtype: int64
- name: climbing
dtype: int64
- name: club_1
dtype: int64
- name: club_leader
dtype: int64
- name: coffee_shop
dtype: int64
- name: college_1
dtype: int64
- name: college_2
dtype: int64
- name: college_3
dtype: int64
- name: company_1
dtype: int64
- name: company_employee
dtype: int64
- name: company_office
dtype: int64
- name: concert_singer
dtype: int64
- name: conference
dtype: int64
- name: country_language
dtype: int64
- name: county_public_safety
dtype: int64
- name: course_teach
dtype: int64
- name: cre_Doc_Control_Systems
dtype: int64
- name: cre_Doc_Template_Mgt
dtype: int64
- name: cre_Doc_Tracking_DB
dtype: int64
- name: cre_Doc_Workflow
dtype: int64
- name: cre_Doc_and_collections
dtype: int64
- name: cre_Docs_and_Epenses
dtype: int64
- name: cre_Drama_Workshop_Groups
dtype: int64
- name: cre_Students_Information_Systems
dtype: int64
- name: cre_Theme_park
dtype: int64
- name: csu_1
dtype: int64
- name: culture_company
dtype: int64
- name: customer_complaints
dtype: int64
- name: customer_deliveries
dtype: int64
- name: customers_and_addresses
dtype: int64
- name: customers_and_invoices
dtype: int64
- name: customers_and_orders
dtype: int64
- name: customers_and_products_contacts
dtype: int64
- name: customers_campaigns_ecommerce
dtype: int64
- name: customers_card_transactions
dtype: int64
- name: debate
dtype: int64
- name: decoration_competition
dtype: int64
- name: department_management
dtype: int64
- name: department_store
dtype: int64
- name: device
dtype: int64
- name: district_spokesman
dtype: int64
- name: document_management
dtype: int64
- name: dog_kennels
dtype: int64
- name: dorm_1
dtype: int64
- name: driving_school
dtype: int64
- name: e_commerce
dtype: int64
- name: e_government
dtype: int64
- name: e_learning
dtype: int64
- name: election
dtype: int64
- name: election_representative
dtype: int64
- name: employee_hire_evaluation
dtype: int64
- name: entertainment_awards
dtype: int64
- name: entrepreneur
dtype: int64
- name: epinions_1
dtype: int64
- name: farm
dtype: int64
- name: film_rank
dtype: int64
- name: flight_1
dtype: int64
- name: flight_2
dtype: int64
- name: flight_4
dtype: int64
- name: flight_company
dtype: int64
- name: formula_1
dtype: int64
- name: game_1
dtype: int64
- name: game_injury
dtype: int64
- name: gas_company
dtype: int64
- name: government_shift
dtype: int64
- name: gymnast
dtype: int64
- name: headphone_store
dtype: int64
- name: hospital_1
dtype: int64
- name: hr_1
dtype: int64
- name: icfp_1
dtype: int64
- name: inn_1
dtype: int64
- name: institution_sports
dtype: int64
- name: insurance_and_eClaims
dtype: int64
- name: insurance_fnol
dtype: int64
- name: insurance_policies
dtype: int64
- name: journal_committee
dtype: int64
- name: loan_1
dtype: int64
- name: local_govt_and_lot
dtype: int64
- name: local_govt_in_alabama
dtype: int64
- name: local_govt_mdm
dtype: int64
- name: machine_repair
dtype: int64
- name: manufactory_1
dtype: int64
- name: manufacturer
dtype: int64
- name: match_season
dtype: int64
- name: medicine_enzyme_interaction
dtype: int64
- name: mountain_photos
dtype: int64
- name: movie_1
dtype: int64
- name: movie_2
dtype: int64
- name: museum_visit
dtype: int64
- name: music_1
dtype: int64
- name: music_2
dtype: int64
- name: music_4
dtype: int64
- name: musical
dtype: int64
- name: network_1
dtype: int64
- name: network_2
dtype: int64
- name: news_report
dtype: int64
- name: online_exams
dtype: int64
- name: orchestra
dtype: int64
- name: party_host
dtype: int64
- name: party_people
dtype: int64
- name: performance_attendance
dtype: int64
- name: perpetrator
dtype: int64
- name: pets_1
dtype: int64
- name: phone_1
dtype: int64
- name: phone_market
dtype: int64
- name: pilot_1
dtype: int64
- name: pilot_record
dtype: int64
- name: planet_1
dtype: int64
- name: poker_player
dtype: int64
- name: product_catalog
dtype: int64
- name: products_for_hire
dtype: int64
- name: products_gen_characteristics
dtype: int64
- name: program_share
dtype: int64
- name: protein_institute
dtype: int64
- name: race_track
dtype: int64
- name: railway
dtype: int64
- name: real_estate_properties
dtype: int64
- name: real_estate_rentals
dtype: int64
- name: region_building
dtype: int64
- name: restaurant_1
dtype: int64
- name: restaurant_bills
dtype: int64
- name: riding_club
dtype: int64
- name: roller_coaster
dtype: int64
- name: sakila_1
dtype: int64
- name: school_bus
dtype: int64
- name: school_finance
dtype: int64
- name: school_player
dtype: int64
- name: scientist_1
dtype: int64
- name: ship_1
dtype: int64
- name: ship_mission
dtype: int64
- name: shop_membership
dtype: int64
- name: sing_contest
dtype: int64
- name: singer
dtype: int64
- name: small_bank_1
dtype: int64
- name: soccer_1
dtype: int64
- name: soccer_2
dtype: int64
- name: soccer_3
dtype: int64
- name: solvency_ii
dtype: int64
- name: sports_competition
dtype: int64
- name: station_weather
dtype: int64
- name: store_1
dtype: int64
- name: store_product
dtype: int64
- name: storm_record
dtype: int64
- name: student_1
dtype: int64
- name: student_assessment
dtype: int64
- name: student_transcripts_tracking
dtype: int64
- name: swimming
dtype: int64
- name: theme_gallery
dtype: int64
- name: tracking_grants_for_research
dtype: int64
- name: tracking_orders
dtype: int64
- name: tracking_share_transactions
dtype: int64
- name: tracking_software_problems
dtype: int64
- name: train_station
dtype: int64
- name: tv_shows
dtype: int64
- name: tvshow
dtype: int64
- name: twitter_1
dtype: int64
- name: university_basketball
dtype: int64
- name: university_rank
dtype: int64
- name: vehicle_driver
dtype: int64
- name: vehicle_rent
dtype: int64
- name: video_game
dtype: int64
- name: voter_1
dtype: int64
- name: voter_2
dtype: int64
- name: warehouse_1
dtype: int64
- name: wedding
dtype: int64
- name: wine_1
dtype: int64
- name: workshop_paper
dtype: int64
- name: world_1
dtype: int64
- name: wrestler
dtype: int64
- name: wta_1
dtype: int64
- name: query_template
dtype: string
splits:
- name: train
num_bytes: 2793662
num_examples: 1610
download_size: 203205
dataset_size: 2793662
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
- **datasource_id**: The Hugging Face dataset from which the template originated.
- **query_template_id**: A unique id tied to the datasource_id.
- **database_ids**: A struct that maps database names to the number of occurrences of the template in that database.
- **query_template**: The query template value.
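For illustration, a small hedged helper (not part of the dataset's tooling) that ranks the databases where a given template occurs, using the `database_ids` struct described above:

```python
def top_databases(record, k=3):
    """Return the k databases where this query template occurs most often.

    `record` is one row of the dataset; `database_ids` maps database names to
    occurrence counts (None or 0 when the template does not occur there).
    """
    counts = {db: n for db, n in record["database_ids"].items() if n}
    return sorted(counts, key=counts.get, reverse=True)[:k]
```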
|
hibiki-org/yodas-tts | ---
license: cc-by-3.0
---
|
HuggingFaceM4/ChartQA | ---
dataset_info:
features:
- name: image
dtype: image
- name: query
dtype: string
- name: label
list: string
- name: human_or_machine
dtype:
class_label:
names:
'0': human
'1': machine
splits:
- name: train
num_bytes: 1256446073.625
num_examples: 28299
- name: val
num_bytes: 84202126
num_examples: 1920
- name: test
num_bytes: 107049156.5
num_examples: 2500
download_size: 964095599
dataset_size: 1447697356.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
license: gpl-3.0
---
# Dataset Card for "ChartQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.62 | ---
pretty_name: Evaluation run of lemon-mint/gemma-ko-7b-instruct-v0.62
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lemon-mint/gemma-ko-7b-instruct-v0.62](https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.62)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.62\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T13:31:01.310721](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.62/blob/main/results_2024-04-03T13-31-01.310721.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6452999467633729,\n\
\ \"acc_stderr\": 0.03207976395932362,\n \"acc_norm\": 0.6462746893095774,\n\
\ \"acc_norm_stderr\": 0.032735425781736015,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5111719354398984,\n\
\ \"mc2_stderr\": 0.015385094353614891\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225405,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.0140841331181043\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5908185620394344,\n\
\ \"acc_stderr\": 0.004906779523192673,\n \"acc_norm\": 0.7961561441943836,\n\
\ \"acc_norm_stderr\": 0.004020309191221551\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.02563425811555495,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.02563425811555495\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830506,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830506\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606646,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606646\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976064,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078955,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078955\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.01322392861674162,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.01322392861674162\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.1888268156424581,\n\
\ \"acc_stderr\": 0.013089403869745457,\n \"acc_norm\": 0.1888268156424581,\n\
\ \"acc_norm_stderr\": 0.013089403869745457\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468497,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n\
\ \"acc_stderr\": 0.012770225252255546,\n \"acc_norm\": 0.500651890482399,\n\
\ \"acc_norm_stderr\": 0.012770225252255546\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.01935336054755369,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.01935336054755369\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5111719354398984,\n\
\ \"mc2_stderr\": 0.015385094353614891\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827943\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \
\ \"acc_stderr\": 0.01242507818839599\n }\n}\n```"
repo_url: https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.62
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|arc:challenge|25_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|arc:challenge|25_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|gsm8k|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|gsm8k|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hellaswag|10_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hellaswag|10_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T13-24-07.246854.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T13-31-01.310721.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T13-31-01.310721.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- '**/details_harness|winogrande|5_2024-04-03T13-24-07.246854.parquet'
- split: 2024_04_03T13_31_01.310721
path:
- '**/details_harness|winogrande|5_2024-04-03T13-31-01.310721.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T13-31-01.310721.parquet'
- config_name: results
data_files:
- split: 2024_04_03T13_24_07.246854
path:
- results_2024-04-03T13-24-07.246854.parquet
- split: 2024_04_03T13_31_01.310721
path:
- results_2024-04-03T13-31-01.310721.parquet
- split: latest
path:
- results_2024-04-03T13-31-01.310721.parquet
---
# Dataset Card for Evaluation run of lemon-mint/gemma-ko-7b-instruct-v0.62
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lemon-mint/gemma-ko-7b-instruct-v0.62](https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.62) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
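As a minimal sketch of that naming scheme (inferred from the split names in this card, not an official API), a run's ISO timestamp appears to become a split name by replacing `-` and `:` with underscores:

```python
def timestamp_to_split_name(ts: str) -> str:
    """Convert a run timestamp like '2024-04-03T13:31:01.310721'
    into the split name used in this dataset's configurations."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-04-03T13:31:01.310721"))
# → 2024_04_03T13_31_01.310721
```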
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.62",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-03T13:31:01.310721](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.62/blob/main/results_2024-04-03T13-31-01.310721.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6452999467633729,
"acc_stderr": 0.03207976395932362,
"acc_norm": 0.6462746893095774,
"acc_norm_stderr": 0.032735425781736015,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5111719354398984,
"mc2_stderr": 0.015385094353614891
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225405,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.0140841331181043
},
"harness|hellaswag|10": {
"acc": 0.5908185620394344,
"acc_stderr": 0.004906779523192673,
"acc_norm": 0.7961561441943836,
"acc_norm_stderr": 0.004020309191221551
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.02563425811555495,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.02563425811555495
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830506,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606646,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606646
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976064,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078955,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078955
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.01322392861674162,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.01322392861674162
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.1888268156424581,
"acc_stderr": 0.013089403869745457,
"acc_norm": 0.1888268156424581,
"acc_norm_stderr": 0.013089403869745457
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468497,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.012770225252255546,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.012770225252255546
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.01935336054755369,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.01935336054755369
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5111719354398984,
"mc2_stderr": 0.015385094353614891
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827943
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.01242507818839599
}
}
```
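The per-task entries above share one shape (`"acc"` / `"acc_stderr"` keyed by a `harness|...` task name), so aggregating them takes only a few lines. The sketch below is illustrative, not part of the evaluation pipeline: it averages the `acc` values of the MMLU (`hendrycksTest`) subtasks from a results dict shaped like the JSON above, using a small hand-made sample.

```python
# Hedged sketch: average the per-subtask MMLU accuracies from a results dict
# shaped like the JSON above. Keys and values below mirror the card's format;
# the `sample` dict is a hand-picked illustration, not the full results.

def mean_mmlu_acc(results):
    """Mean `acc` over all harness|hendrycksTest-* entries."""
    accs = [
        v["acc"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.847953216374269},
    "harness|gsm8k|5": {"acc": 0.7156937073540561},  # excluded from the mean
}
avg = mean_mmlu_acc(sample)
```

The same filter-and-average pattern works for any other task family by changing the key prefix.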
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PlanTL-GOB-ES/cantemist-ner | ---
annotations_creators:
- expert-generated
language:
- es
tags:
- biomedical
- clinical
- spanish
multilinguality:
- monolingual
task_categories:
- token-classification
task_ids:
- named-entity-recognition
license:
- cc-by-4.0
---
# CANTEMIST
## Dataset Description
Manually classified collection of Spanish oncological clinical case reports.
- **Homepage:** [zenodo](https://zenodo.org/record/3978041)
- **Paper:** [Named Entity Recognition, Concept Normalization and Clinical Coding: Overview of the Cantemist Track for Cancer Text Mining in Spanish, Corpus, Guidelines, Methods and Results](https://www.researchgate.net/profile/Antonio-Miranda-Escalada-2/publication/352786464_Named_Entity_Recognition_Concept_Normalization_and_Clinical_Coding_Overview_of_the_Cantemist_Track_for_Cancer_Text_Mining_in_Spanish_Corpus_Guidelines_Methods_and_Results/links/60d98a3b458515d6fbe382d8/Named-Entity-Recognition-Concept-Normalization-and-Clinical-Coding-Overview-of-the-Cantemist-Track-for-Cancer-Text-Mining-in-Spanish-Corpus-Guidelines-Methods-and-Results.pdf)
- **Point of Contact:** encargo-pln-life@bsc.es
### Dataset Summary
Collection of 1301 oncological clinical case reports written in Spanish, with tumor morphology mentions manually annotated and mapped by clinical experts to a controlled terminology. Every tumor morphology mention is linked to an eCIE-O code (the Spanish equivalent of ICD-O).
The training subset contains 501 documents, the development subset 500, and the test subset 300. The original dataset is distributed in [Brat](https://brat.nlplab.org/standoff.html) format.
This dataset was designed for the CANcer TExt Mining Shared Task, sponsored by [Plan-TL](https://plantl.mineco.gob.es/Paginas/index.aspx).
For further information, please visit [the official website](https://temu.bsc.es/cantemist/).
### Supported Tasks
Named Entity Recognition (NER)
### Languages
- Spanish (es)
### Directory Structure
* README.md
* cantemist.py
* train.conll
* dev.conll
* test.conll
## Dataset Structure
### Data Instances
Three four-column files, one for each split.
### Data Fields
Every file has 4 columns:
* 1st column: Word form or punctuation symbol
* 2nd column: Original BRAT file name
* 3rd column: Spans
* 4th column: IOB tag
#### Example
<pre>
El cc_onco101 662_664 O
informe cc_onco101 665_672 O
HP cc_onco101 673_675 O
es cc_onco101 676_678 O
compatible cc_onco101 679_689 O
con cc_onco101 690_693 O
adenocarcinoma cc_onco101 694_708 B-MORFOLOGIA_NEOPLASIA
moderadamente cc_onco101 709_722 I-MORFOLOGIA_NEOPLASIA
diferenciado cc_onco101 723_735 I-MORFOLOGIA_NEOPLASIA
que cc_onco101 736_739 O
afecta cc_onco101 740_746 O
a cc_onco101 747_748 O
grasa cc_onco101 749_754 O
peripancreática cc_onco101 755_770 O
sobrepasando cc_onco101 771_783 O
la cc_onco101 784_786 O
serosa cc_onco101 787_793 O
, cc_onco101 793_794 O
infiltración cc_onco101 795_807 O
perineural cc_onco101 808_818 O
. cc_onco101 818_819 O
</pre>
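The 4-column layout above is simple enough to parse by hand. The following sketch (one possible reading of the format, assuming whitespace-separated columns in the order word / file / span / tag) extracts entity mentions from the IOB tags:

```python
# Minimal sketch: parse the 4-column format shown above and extract entity
# mentions from the IOB tags. Assumed column order: word, file, span, tag.

def extract_mentions(lines):
    """Yield (mention_text, label) pairs from 4-column IOB lines."""
    current_words, current_label = [], None
    for line in lines:
        line = line.strip()
        if not line:
            continue
        word, _file, _span, tag = line.split()
        if tag.startswith("B-"):
            if current_words:  # close any open mention first
                yield " ".join(current_words), current_label
            current_words, current_label = [word], tag[2:]
        elif tag.startswith("I-") and current_words:
            current_words.append(word)
        else:  # an "O" tag ends any open mention
            if current_words:
                yield " ".join(current_words), current_label
            current_words, current_label = [], None
    if current_words:
        yield " ".join(current_words), current_label

example = """\
compatible cc_onco101 679_689 O
con cc_onco101 690_693 O
adenocarcinoma cc_onco101 694_708 B-MORFOLOGIA_NEOPLASIA
moderadamente cc_onco101 709_722 I-MORFOLOGIA_NEOPLASIA
diferenciado cc_onco101 723_735 I-MORFOLOGIA_NEOPLASIA
que cc_onco101 736_739 O
"""

mentions = list(extract_mentions(example.splitlines()))
# mentions == [("adenocarcinoma moderadamente diferenciado", "MORFOLOGIA_NEOPLASIA")]
```

The same function can be applied line by line to `train.conll`, `dev.conll`, and `test.conll`.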
### Data Splits
| Split | Size |
| ------------- | ------------- |
| `train` | 19,397 |
| `dev` | 18,165 |
| `test` | 11,168 |
## Dataset Creation
### Curation Rationale
For compatibility with similar datasets in other languages, we followed existing curation guidelines as closely as possible.
### Source Data
#### Initial Data Collection and Normalization
The selected clinical case reports are fairly similar to hospital health records. To increase the usefulness and practical relevance of the CANTEMIST corpus, we selected clinical cases affecting all genders and that comprised most ages (from children to the elderly) and of various complexity levels (solid tumors, hemato-oncological malignancies, neuroendocrine cancer...).
The CANTEMIST cases include clinical signs and symptoms, personal and family history, current illness, physical examination, complementary tests (blood tests, imaging, pathology), diagnosis, treatment (including adverse effects of chemotherapy), evolution and outcome.
#### Who are the source language producers?
Humans; there is no machine-generated data.
### Annotations
#### Annotation process
The manual annotation of the Cantemist corpus was performed by clinical experts following the Cantemist guidelines (for more detail refer to this [paper](http://ceur-ws.org/Vol-2664/cantemist_overview.pdf)). These guidelines contain rules for annotating morphology neoplasms in Spanish oncology clinical cases, as well as for mapping these annotations to eCIE-O.
A medical doctor was regularly consulted by annotators (scientists with PhDs on cancer-related subjects) for the most difficult pathology expressions. This same doctor periodically checked a random selection of annotated clinical records and these annotations were compared and discussed with the annotators. To normalize a selection of very complex cases, MD specialists in pathology from one of the largest university hospitals in Spain were consulted.
#### Who are the annotators?
Clinical experts.
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
This corpus contributes to the development of medical language models in Spanish.
### Discussion of Biases
Not applicable.
## Additional Information
### Dataset Curators
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@bsc.es).
For further information, send an email to (plantl-gob-es@bsc.es).
This work was funded by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://avancedigital.mineco.gob.es/en-us/Paginas/index.aspx) within the framework of the [Plan-TL](https://plantl.mineco.gob.es/Paginas/index.aspx).
### Licensing information
This work is licensed under [CC Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) License.
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Citation Information
```bibtex
@article{cantemist,
title={Named Entity Recognition, Concept Normalization and Clinical Coding: Overview of the Cantemist Track for Cancer Text Mining in Spanish, Corpus, Guidelines, Methods and Results.},
author={Miranda-Escalada, Antonio and Farr{\'e}, Eul{\`a}lia and Krallinger, Martin},
journal={IberLEF@ SEPLN},
pages={303--323},
year={2020}
}
```
### Contributions
[N/A]
|
newsph | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- fil
- tl
license:
- gpl-3.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: newsph-nli
pretty_name: NewsPH-NLI
dataset_info:
features:
- name: text
dtype: string
config_name: newsph
splits:
- name: train
num_bytes: 298833914
num_examples: 2190465
download_size: 104086466
dataset_size: 298833914
---
# Dataset Card for NewsPH
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Filipino Text Benchmarks](https://github.com/jcblaisecruz02/Filipino-Text-Benchmarks)
- **Repository:**
- **Paper:** [Investigating the True Performance of Transformers in Low-Resource Languages: A Case Study in Automatic Corpus Creation](https://arxiv.org/abs/2010.11574)
- **Leaderboard:**
- **Point of Contact:** [Jan Christian Blaise Cruz](jan_christian_cruz@dlsu.edu.ph)
### Dataset Summary
Raw collection of news articles in Filipino, used to produce the NewsPH-NLI dataset in Cruz et al. (2020).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Tagalog/Filipino
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- `text` (`str`)
The dataset is in plaintext and only has one field ("text"). It can be used for language modeling.
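For language modeling, a single-field plaintext corpus like this is typically concatenated and cut into fixed-length token blocks. The sketch below illustrates that step only; the whitespace tokenizer, block size, and sample sentences are placeholder choices, not part of this dataset's pipeline.

```python
# Illustrative sketch: chunk a single-field plaintext corpus into fixed-length
# sequences for language modeling. Whitespace tokenization and the block size
# are placeholder assumptions; a real pipeline would use a subword tokenizer.

def make_lm_blocks(texts, block_size=8):
    """Concatenate whitespace tokens from all texts and cut them into blocks."""
    tokens = [tok for text in texts for tok in text.split()]
    return [
        tokens[i : i + block_size]
        for i in range(0, len(tokens) - block_size + 1, block_size)
    ]

corpus = [
    "Halimbawa ng balita sa Filipino.",
    "Isa pang maikling artikulo para sa language modeling.",
]
blocks = make_lm_blocks(corpus, block_size=4)
# Each block holds exactly 4 tokens; the trailing remainder is dropped.
```

Dropping the remainder (rather than padding) is the common choice for causal-LM pretraining, since every block then contributes full-length context.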
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@jcblaisecruz02](https://github.com/jcblaisecruz02) for adding this dataset. |
open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b | ---
pretty_name: Evaluation run of AI-Sweden-Models/gpt-sw3-40b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AI-Sweden-Models/gpt-sw3-40b](https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T15:00:33.518629](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b/blob/main/results_2023-12-04T15-00-33.518629.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.354049089015047,\n\
\ \"acc_stderr\": 0.033417526940887884,\n \"acc_norm\": 0.35741274788934346,\n\
\ \"acc_norm_stderr\": 0.03422297037663694,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396728,\n \"mc2\": 0.3752465265593006,\n\
\ \"mc2_stderr\": 0.013533322814931005\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.40102389078498296,\n \"acc_stderr\": 0.01432225579071987,\n\
\ \"acc_norm\": 0.4300341296928328,\n \"acc_norm_stderr\": 0.014467631559137993\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.535749850627365,\n\
\ \"acc_stderr\": 0.004977010670436551,\n \"acc_norm\": 0.7236606253734316,\n\
\ \"acc_norm_stderr\": 0.004462727543055892\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361063,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361063\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3471698113207547,\n \"acc_stderr\": 0.029300101705549655,\n\
\ \"acc_norm\": 0.3471698113207547,\n \"acc_norm_stderr\": 0.029300101705549655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.03618664819936245,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.03618664819936245\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.038924311065187546,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.038924311065187546\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947559,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947559\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36129032258064514,\n\
\ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.36129032258064514,\n\
\ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431857,\n\
\ \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431857\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3434343434343434,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.035553003195576735,\n\
\ \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.035553003195576735\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28717948717948716,\n \"acc_stderr\": 0.02293992541853062,\n\
\ \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.02293992541853062\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42752293577981654,\n \"acc_stderr\": 0.02121091020430043,\n \"\
acc_norm\": 0.42752293577981654,\n \"acc_norm_stderr\": 0.02121091020430043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.02596742095825853,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.02596742095825853\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47549019607843135,\n \"acc_stderr\": 0.03505093194348798,\n \"\
acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.03505093194348798\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4767932489451477,\n \"acc_stderr\": 0.032512152011410174,\n \
\ \"acc_norm\": 0.4767932489451477,\n \"acc_norm_stderr\": 0.032512152011410174\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755803,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755803\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4188034188034188,\n\
\ \"acc_stderr\": 0.03232128912157792,\n \"acc_norm\": 0.4188034188034188,\n\
\ \"acc_norm_stderr\": 0.03232128912157792\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4329501915708812,\n\
\ \"acc_stderr\": 0.017718469101513982,\n \"acc_norm\": 0.4329501915708812,\n\
\ \"acc_norm_stderr\": 0.017718469101513982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3265895953757225,\n \"acc_stderr\": 0.025248264774242826,\n\
\ \"acc_norm\": 0.3265895953757225,\n \"acc_norm_stderr\": 0.025248264774242826\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3464052287581699,\n \"acc_stderr\": 0.02724561304721536,\n\
\ \"acc_norm\": 0.3464052287581699,\n \"acc_norm_stderr\": 0.02724561304721536\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3504823151125402,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.3504823151125402,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.026869490744815257,\n\
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.026869490744815257\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33702737940026073,\n\
\ \"acc_stderr\": 0.012072836273691327,\n \"acc_norm\": 0.33702737940026073,\n\
\ \"acc_norm_stderr\": 0.012072836273691327\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.02643132987078954,\n\
\ \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.02643132987078954\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.34967320261437906,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4427860696517413,\n\
\ \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.4427860696517413,\n\
\ \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.47953216374269003,\n \"acc_stderr\": 0.038316105328219316,\n\
\ \"acc_norm\": 0.47953216374269003,\n \"acc_norm_stderr\": 0.038316105328219316\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396728,\n \"mc2\": 0.3752465265593006,\n\
\ \"mc2_stderr\": 0.013533322814931005\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.013115085457681709\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \
\ \"acc_stderr\": 0.005829898355937188\n }\n}\n```"
repo_url: https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|arc:challenge|25_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|gsm8k|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hellaswag|10_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T15-00-33.518629.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- '**/details_harness|winogrande|5_2023-12-04T15-00-33.518629.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T15-00-33.518629.parquet'
- config_name: results
data_files:
- split: 2023_12_04T15_00_33.518629
path:
- results_2023-12-04T15-00-33.518629.parquet
- split: latest
path:
- results_2023-12-04T15-00-33.518629.parquet
---
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-40b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-40b](https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T15:00:33.518629](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b/blob/main/results_2023-12-04T15-00-33.518629.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.354049089015047,
"acc_stderr": 0.033417526940887884,
"acc_norm": 0.35741274788934346,
"acc_norm_stderr": 0.03422297037663694,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396728,
"mc2": 0.3752465265593006,
"mc2_stderr": 0.013533322814931005
},
"harness|arc:challenge|25": {
"acc": 0.40102389078498296,
"acc_stderr": 0.01432225579071987,
"acc_norm": 0.4300341296928328,
"acc_norm_stderr": 0.014467631559137993
},
"harness|hellaswag|10": {
"acc": 0.535749850627365,
"acc_stderr": 0.004977010670436551,
"acc_norm": 0.7236606253734316,
"acc_norm_stderr": 0.004462727543055892
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361063,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361063
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3471698113207547,
"acc_stderr": 0.029300101705549655,
"acc_norm": 0.3471698113207547,
"acc_norm_stderr": 0.029300101705549655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.03618664819936245,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.03618664819936245
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187546,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187546
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36129032258064514,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.36129032258064514,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.03904272341431857,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.03904272341431857
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3434343434343434,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.3434343434343434,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41450777202072536,
"acc_stderr": 0.035553003195576735,
"acc_norm": 0.41450777202072536,
"acc_norm_stderr": 0.035553003195576735
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28717948717948716,
"acc_stderr": 0.02293992541853062,
"acc_norm": 0.28717948717948716,
"acc_norm_stderr": 0.02293992541853062
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895992,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895992
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42752293577981654,
"acc_stderr": 0.02121091020430043,
"acc_norm": 0.42752293577981654,
"acc_norm_stderr": 0.02121091020430043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.02596742095825853,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.02596742095825853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.03505093194348798,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.03505093194348798
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4767932489451477,
"acc_stderr": 0.032512152011410174,
"acc_norm": 0.4767932489451477,
"acc_norm_stderr": 0.032512152011410174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755803,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755803
},
"harness|hendrycksTest-management|5": {
"acc": 0.4563106796116505,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.4563106796116505,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4188034188034188,
"acc_stderr": 0.03232128912157792,
"acc_norm": 0.4188034188034188,
"acc_norm_stderr": 0.03232128912157792
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4329501915708812,
"acc_stderr": 0.017718469101513982,
"acc_norm": 0.4329501915708812,
"acc_norm_stderr": 0.017718469101513982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3265895953757225,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.3265895953757225,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3464052287581699,
"acc_stderr": 0.02724561304721536,
"acc_norm": 0.3464052287581699,
"acc_norm_stderr": 0.02724561304721536
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3504823151125402,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.3504823151125402,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.026869490744815257,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.026869490744815257
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33702737940026073,
"acc_stderr": 0.012072836273691327,
"acc_norm": 0.33702737940026073,
"acc_norm_stderr": 0.012072836273691327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.02643132987078954,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.02643132987078954
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.34967320261437906,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.34967320261437906,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4427860696517413,
"acc_stderr": 0.03512310964123937,
"acc_norm": 0.4427860696517413,
"acc_norm_stderr": 0.03512310964123937
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.47953216374269003,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.47953216374269003,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396728,
"mc2": 0.3752465265593006,
"mc2_stderr": 0.013533322814931005
},
"harness|winogrande|5": {
"acc": 0.6795580110497238,
"acc_stderr": 0.013115085457681709
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.005829898355937188
}
}
```
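The per-task numbers above can be cross-checked by hand. As a minimal sketch, here is an unweighted macro-average over a few accuracies copied from the JSON (the task subset and the averaging rule are illustrative assumptions, not the leaderboard's exact aggregation):

```python
from statistics import mean

# A few per-task accuracies hand-copied from the results JSON above.
task_acc = {
    "harness|arc:challenge|25": 0.40102389078498296,
    "harness|hellaswag|10": 0.535749850627365,
    "harness|hendrycksTest-abstract_algebra|5": 0.29,
    "harness|hendrycksTest-anatomy|5": 0.4,
}

# Unweighted macro-average: every task counts equally.
macro_avg = mean(task_acc.values())
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```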
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/joy_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of joy (Pokémon)
This is the dataset of joy (Pokémon), containing 230 images and their tags.
The core tags of this character are `pink_hair, hat, nurse_cap, blue_eyes, breasts, hair_rings, long_hair, white_headwear, eyelashes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 230 | 186.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joy_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 230 | 121.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joy_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 458 | 223.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joy_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 230 | 171.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joy_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 458 | 296.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joy_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/joy_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
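The non-raw packages above are of IMG+TXT type, i.e. each image sits next to a plain-text tag file. A minimal sketch for tallying tag frequencies over such a directory (the comma-separated `.txt` layout is assumed from the usual IMG+TXT convention; `count_tags` is a hypothetical helper, not part of waifuc):

```python
import os
from collections import Counter

def count_tags(dataset_dir):
    """Tally comma-separated tags across all .txt files in a directory."""
    counter = Counter()
    for name in os.listdir(dataset_dir):
        if not name.endswith('.txt'):
            continue
        with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        counter.update(tags)
    return counter

# Example: print the 20 most common tags of an extracted IMG+TXT package.
# for tag, n in count_tags('dataset_dir').most_common(20):
#     print(f'{n:4d}  {tag}')
```

Running this over the extracted `dataset-800.zip` should roughly mirror the tag clusters listed below.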
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, nurse, smile, solo, looking_at_viewer, short_sleeves, apron, pink_dress, blush, open_mouth, bangs, full_body, standing, closed_mouth, own_hands_together, shoes, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, nurse, pink_dress, bangs, collared_dress, open_mouth, solo, white_apron, simple_background, white_background, :d, puffy_short_sleeves, upper_body |
| 2 | 15 |  |  |  |  |  | 1girl, blush, navel, nipples, collarbone, nude, solo, large_breasts, looking_at_viewer, open_mouth, simple_background, nurse, pussy, white_background, :d, shiny, tongue |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, pussy, vaginal, large_breasts, nurse, open_mouth, sex, uncensored, spread_legs, blush, nude, solo_focus, thighhighs, clitoris, navel, veiny_penis |
| 4 | 6 |  |  |  |  |  | 1girl, barefoot, shiny_hair, shiny_skin, toes, blue_bikini, collarbone, dark-skinned_female, bangs, cleavage, closed_mouth, looking_at_viewer, navel, smile, solo, bare_arms, full_body, knees, sitting, tan, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | nurse | smile | solo | looking_at_viewer | short_sleeves | apron | pink_dress | blush | open_mouth | bangs | full_body | standing | closed_mouth | own_hands_together | shoes | white_background | collared_dress | white_apron | simple_background | :d | puffy_short_sleeves | upper_body | navel | nipples | collarbone | nude | large_breasts | pussy | shiny | tongue | 1boy | hetero | vaginal | sex | uncensored | spread_legs | solo_focus | thighhighs | clitoris | veiny_penis | barefoot | shiny_hair | shiny_skin | toes | blue_bikini | dark-skinned_female | cleavage | bare_arms | knees | sitting | tan |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-------|:--------------------|:----------------|:--------|:-------------|:--------|:-------------|:--------|:------------|:-----------|:---------------|:---------------------|:--------|:-------------------|:-----------------|:--------------|:--------------------|:-----|:----------------------|:-------------|:--------|:----------|:-------------|:-------|:----------------|:--------|:--------|:---------|:-------|:---------|:----------|:------|:-------------|:--------------|:-------------|:-------------|:-----------|:--------------|:-----------|:-------------|:-------------|:-------|:--------------|:----------------------|:-----------|:------------|:--------|:----------|:------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | X | | | | X | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | | X | X | | | | X | X | | | | | | | X | | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | | | | | | X | X | | | | | | | | | | | | | | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | X | X | | | | | | X | X | | X | | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
EasyTerms/Manuel_dataset | ---
license: apache-2.0
task_categories:
- summarization
language:
- en
tags:
- legal
pretty_name: lilo
size_categories:
- n<1K
---
|
liuyanchen1015/MULTI_VALUE_cola_fronting_pobj | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 21984
num_examples: 302
- name: test
num_bytes: 23552
num_examples: 324
- name: train
num_bytes: 204589
num_examples: 2747
download_size: 122071
dataset_size: 250125
---
# Dataset Card for "MULTI_VALUE_cola_fronting_pobj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/hub_issues_first_comment | Invalid username or password. |
enobyte/admedika-base | ---
license: apache-2.0
---
|
LeslieC21/Mr_Red | ---
license: other
---
|
open-llm-leaderboard/details_Weyaxi__Einstein-v4-phi2 | ---
pretty_name: Evaluation run of Weyaxi/Einstein-v4-phi2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Einstein-v4-phi2](https://huggingface.co/Weyaxi/Einstein-v4-phi2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-v4-phi2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T10:59:28.686847](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v4-phi2/blob/main/results_2024-04-15T10-59-28.686847.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5712227421870523,\n\
\ \"acc_stderr\": 0.03378049374253148,\n \"acc_norm\": 0.5723551969712353,\n\
\ \"acc_norm_stderr\": 0.03447181876029467,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.45802811774607116,\n\
\ \"mc2_stderr\": 0.015073729059736581\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379976,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5599482174865564,\n\
\ \"acc_stderr\": 0.0049537871465109365,\n \"acc_norm\": 0.7406891057558256,\n\
\ \"acc_norm_stderr\": 0.004373608212561021\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517414,\n \"\
acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517414\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.0316187791793541,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.0316187791793541\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722724,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722724\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460285,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460285\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520981,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520981\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6871008939974457,\n\
\ \"acc_stderr\": 0.01658093594030406,\n \"acc_norm\": 0.6871008939974457,\n\
\ \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688235,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688235\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29832402234636873,\n\
\ \"acc_stderr\": 0.015301840045129276,\n \"acc_norm\": 0.29832402234636873,\n\
\ \"acc_norm_stderr\": 0.015301840045129276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839796,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.02946218923337061,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.02946218923337061\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.01261560047573492,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.01261560047573492\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976687,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976687\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02011692534742242,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02011692534742242\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.45802811774607116,\n\
\ \"mc2_stderr\": 0.015073729059736581\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.01234691486341531\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5398028809704322,\n \
\ \"acc_stderr\": 0.01372877671409937\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Einstein-v4-phi2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-59-28.686847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T10-59-28.686847.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- '**/details_harness|winogrande|5_2024-04-15T10-59-28.686847.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T10-59-28.686847.parquet'
- config_name: results
data_files:
- split: 2024_04_15T10_59_28.686847
path:
- results_2024-04-15T10-59-28.686847.parquet
- split: latest
path:
- results_2024-04-15T10-59-28.686847.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Einstein-v4-phi2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-v4-phi2](https://huggingface.co/Weyaxi/Einstein-v4-phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-v4-phi2",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T10:59:28.686847](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-v4-phi2/blob/main/results_2024-04-15T10-59-28.686847.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5712227421870523,
"acc_stderr": 0.03378049374253148,
"acc_norm": 0.5723551969712353,
"acc_norm_stderr": 0.03447181876029467,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.45802811774607116,
"mc2_stderr": 0.015073729059736581
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379976,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809174
},
"harness|hellaswag|10": {
"acc": 0.5599482174865564,
"acc_stderr": 0.0049537871465109365,
"acc_norm": 0.7406891057558256,
"acc_norm_stderr": 0.004373608212561021
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.0316187791793541,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.0316187791793541
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722724,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722724
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460285,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460285
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520981,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520981
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6871008939974457,
"acc_stderr": 0.01658093594030406,
"acc_norm": 0.6871008939974457,
"acc_norm_stderr": 0.01658093594030406
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688235,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688235
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29832402234636873,
"acc_stderr": 0.015301840045129276,
"acc_norm": 0.29832402234636873,
"acc_norm_stderr": 0.015301840045129276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347813,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839796,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.02946218923337061,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.02946218923337061
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.01261560047573492,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.01261560047573492
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976687,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976687
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02011692534742242,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02011692534742242
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.45802811774607116,
"mc2_stderr": 0.015073729059736581
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.01234691486341531
},
"harness|gsm8k|5": {
"acc": 0.5398028809704322,
"acc_stderr": 0.01372877671409937
}
}
```
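Once parsed, these per-task entries are plain dictionaries and can be post-processed directly; a minimal, hypothetical sketch (the three scores below are copied from the results above for illustration, not the full set):

```python
# Hypothetical post-processing: rank evaluated tasks by accuracy.
# The three entries are copied from the results above for illustration.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8290598290598291},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|winogrande|5": {"acc": 0.7387529597474349},
}

def rank_tasks(per_task):
    """Sort per-task entries by accuracy, highest first."""
    return sorted(per_task.items(), key=lambda kv: kv[1]["acc"], reverse=True)

best_task, best_scores = rank_tasks(results)[0]
print(best_task)  # harness|hendrycksTest-marketing|5
```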
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Nexdata/Saudi_Arabic_Spontaneous_Speech_Data | ---
task_categories:
- automatic-speech-recognition
language:
- ar
---
# Dataset Card for Nexdata/Saudi_Arabic_Spontaneous_Speech_Data
## Description
849 Hours - Saudi Arabic Spontaneous Speech Data, with content covering multiple topics. All the speech audio was manually transcribed into text; speaker identity, gender, and other attributes are also annotated. This dataset can be used for voiceprint recognition model training, corpus construction for machine translation, and algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1150?source=Huggingface
# Specifications
## Format
16kHz, 16bit, wav, mono channel;
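As a quick sanity check on downloaded clips, the stated format can be verified with Python's standard-library `wave` module; a hedged sketch (the file name is hypothetical, and the demo writes a tiny silent clip rather than using real dataset audio):

```python
import struct
import wave

def check_format(path):
    """Verify a clip matches the stated spec: 16 kHz, 16-bit, mono WAV."""
    with wave.open(path, "rb") as wf:
        return (wf.getframerate() == 16000
                and wf.getsampwidth() == 2   # 16-bit samples = 2 bytes
                and wf.getnchannels() == 1)  # mono

# Write one second of silence in the same format to demonstrate the check.
with wave.open("demo.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(16000)
    wf.writeframes(struct.pack("<16000h", *([0] * 16000)))

print(check_format("demo.wav"))  # True
```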
## Content category
including interviews, variety shows, livestreams, etc.
## Language
Arabic;
## Annotation
annotation of the transcription text, speaker identity, and gender;
## Application scenarios
speech recognition, video caption generation and video content review;
## Accuracy
Sentence Accuracy Rate (SAR) of no less than 95%.
# Licensing Information
Commercial License |
akatoch/classicalmididata | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 40293850
num_examples: 9209
download_size: 7758901
dataset_size: 40293850
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vezora/Tested-143k-Python-Alpaca | ---
license: apache-2.0
---
Contributors: Nicolas Mejia Petit
# Vezora's CodeTester Dataset

## Introduction
Today, on March 6, 2024, we are excited to release our internal Python dataset with 143,327 examples of code. These examples have been meticulously tested and verified as working. Our dataset was created using a script we developed.
### Dataset Creation
- Our script operates by extracting Python code from the output section of Alpaca-formatted datasets. It tests each extracted piece of code, keeping it if it passes and removing it if it fails, then saves all the working code in a separate dataset.
- Our second script removes the non-working code from your Alpaca datasets, saving it to a "not working code" JSON file, and keeps all the working examples along with any other non-Python examples.
- !WARNING! These scripts run in your local computer's Python environment, with multithreading so they run fast. If there is any malicious Python code in your dataset, it WILL run on your local computer, so run the scripts in a VM. Lastly, you must have the relevant Python packages installed (mostly common ones that most users already have, but also some like tkinter) so that certain lines of code can be tested.
- (If you are struggling to convert your dataset to Alpaca format, give the first three questions of both datasets to ChatGPT or Bing and ask for a script to convert your dataset to the target format. It might take one or two tries.)
- The creation of this dataset involved leveraging open source datasets from various sources, including Wizard-LM's Evol datasets, CodeUp's 19k, sahil2801's Code Alpaca, Eric Hartford's Dolphin and Dolphin Coder ShareGPT 290k, both of ise-uiuc's Magicoder datasets, and a selection of hand-prompted GPT-4 code questions. The resulting dataset was carefully deduplicated.
- We discovered that many of the open source datasets contained thousands of non-functional code examples, often plagued by module errors and other issues. Importantly, our script's approach is highly adaptable and could potentially be used to test code in other languages such as C++, C, SQL, and more.
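The released scripts are not reproduced here, but the core filtering idea can be sketched as follows (a simplified, hypothetical reimplementation; the function names and regex are assumptions, and like the real scripts it executes dataset code, so run it in a VM):

```python
import os
import re
import subprocess
import sys
import tempfile

# A fenced code block in an Alpaca 'output' field; the python tag is optional.
CODE_FENCE = re.compile(r"`{3}(?:python)?\n(.*?)`{3}", re.DOTALL)

def extract_code(output_text):
    """Pull the first fenced code block out of an Alpaca 'output' field."""
    match = CODE_FENCE.search(output_text)
    return match.group(1) if match else None

def code_runs(code, timeout=5):
    """Return True if the snippet executes without error (run this in a VM!)."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run([sys.executable, path],
                                capture_output=True, timeout=timeout)
        return result.returncode == 0
    except subprocess.TimeoutExpired:
        return False
    finally:
        os.unlink(path)

def filter_dataset(examples):
    """Keep working code and any non-code examples, drop broken code."""
    kept = []
    for ex in examples:
        code = extract_code(ex["output"])
        if code is None or code_runs(code):
            kept.append(ex)
    return kept

FENCE = "`" * 3  # avoids literal triple backticks inside this listing
examples = [
    {"instruction": "Add two numbers",
     "output": f"{FENCE}python\nprint(1 + 2)\n{FENCE}"},
    {"instruction": "Broken import",
     "output": f"{FENCE}python\nimport not_a_real_module\n{FENCE}"},
]
print(len(filter_dataset(examples)))  # the broken import is filtered out -> 1
```

The real scripts add multithreading and save the failing examples to their own JSON file, but the pass/fail decision reduces to "does the extracted snippet exit cleanly".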
### Usage Guidelines
We invested a significant amount of time in developing this script. If you intend to use it to extract functional code in your own projects or datasets, and/or plan on using our dataset, please include the following attribution in your model's or dataset's repository:
"Filtered Using Vezora's CodeTester"
## Motivation
Many openly available training datasets are contaminated with code that does not run: code that you would be teaching a model, degrading its performance. We release this dataset to serve as a large corpus of working Python instruction-and-answer examples. The current iteration has 141k examples of working code, allowing a model to learn a wide range of Python tasks as well as possible.
### Limitations of Foundational Models
It's essential to note that even when writing syntactically correct code, foundational models often lack access to up-to-date Python and API documentation. As a result, code generated by these models may contain errors stemming from outdated calls or methods.
## Building a Strong Python Code Model
If you aspire to build a robust Python code model, we recommend the following steps:
1. Pretrain with Mistral 7b on UP-TO-DATE Python and API documentation. (During our testing we found that even when a model writes syntactically correct code, it lacks up-to-date API calls and functions.)
2. Consider incorporating programming textbooks into your training.
3. Fine-tune your model with our dataset using SFT (Supervised Fine-Tuning).
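For step 3, SFT usually means rendering each instruction/output pair into a single training string; a minimal sketch (the template wording is one common Alpaca-style choice, not something this dataset mandates):

```python
# One common Alpaca-style prompt layout for SFT; the exact template is a
# design choice, not something this dataset prescribes.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n{output}"
)

def to_prompt(example):
    """Render one instruction/output pair into a single SFT training string."""
    return ALPACA_TEMPLATE.format(**example)

sample = {"instruction": "Print hello", "output": "print('hello')"}
print(to_prompt(sample))
```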
In the future, we may also release our "not working" code dataset, allowing users to train a Direct Preference Optimization (DPO) model that rewards functional code over non-functional code. Although, with the second script provided, it would be pretty easy to do it yourself.
We hope this dataset serves as a valuable resource for the community and contributes to the improvement of code-related AI models.
|
g4drone/voz01 | ---
license: openrail
---
|
mtc/frank-test-set-with-faithfulness-annotation | ---
dataset_info:
features:
- name: hash
dtype: string
- name: model_name
dtype: string
- name: article
dtype: string
- name: summary
dtype: string
- name: reference
dtype: string
- name: summary_sentences
sequence: string
- name: summary_sentences_annotations
list:
- name: annotator_0
sequence: string
- name: annotator_1
sequence: string
- name: annotator_2
sequence: string
- name: split
dtype: string
- name: Factual
dtype: bool
- name: NoE_count
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: validation
num_bytes: 2555126
num_examples: 671
- name: test
num_bytes: 5688685
num_examples: 1575
download_size: 1477522
dataset_size: 8243811
---
# Dataset Card for "frank-test-set-with-faithfulness-annotation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
forag/webcpm_oe | ---
license: apache-2.0
task_categories:
- question-answering
language:
- zh
pretty_name: Outline Enhanced WebCPM Dataset
size_categories:
- 1K<n<10K
--- |
angeluriot/DimensionGPT_instruct | ---
configs:
- config_name: human_conversations
data_files: human_conversations.json
- config_name: chatbot_conversations
data_files: chatbot_conversations.json
- config_name: dimension_gpt_conversations
data_files: dimension_gpt_conversations.json
- config_name: human_preprompts
data_files: human_preprompts.json
- config_name: chatbot_preprompts
data_files: chatbot_preprompts.json
- config_name: dimension_gpt_preprompts
data_files: dimension_gpt_preprompts.json
--- |
patruff/chucklesMistralA | ---
dataset_info:
features:
- name: original
dtype: string
- name: chucklebot
dtype: string
splits:
- name: train
num_bytes: 232847
num_examples: 986
- name: test
num_bytes: 58801
num_examples: 247
download_size: 128425
dataset_size: 291648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Vxnkyyyyy/imdb_sentiment_finetune_dataset | ---
dataset_info:
features:
- name: review
dtype: string
- name: sentiment
dtype: int64
splits:
- name: train
num_bytes: 2588807
num_examples: 2000
- name: validation
num_bytes: 2667965
num_examples: 2000
- name: test
num_bytes: 21177655
num_examples: 16000
download_size: 17194624
dataset_size: 26434427
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
EnergyStarAI/summarization | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1752743
num_examples: 1000
download_size: 1115781
dataset_size: 1752743
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Thauab/voice552 | ---
license: openrail
---
|
gowitheflowlab/parallel-small-w-nli | ---
dataset_info:
features:
- name: English
dtype: string
- name: Other Language
dtype: string
splits:
- name: train
num_bytes: 1004127782
num_examples: 6645134
download_size: 612159821
dataset_size: 1004127782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pedrohiperiq/datasetpica.com | ---
license: openrail
---
|